Design Ethical Machines for Children (Introduction)

We are pleased to present our vision for designing ethical machines for children from 2026 to 2030.
Why
From 2015 to 2020, we focused on understanding how young children (aged 7–11) navigate data privacy online, and how we can better support them. This work led to the creation of our KOALA Hero app and toolkit, which helps parents and children explore data privacy risks together and make informed choices about the apps they use on their mobile devices.
From 2021 to 2026, we deepened our understanding of how children in this age group talk about data privacy and perceive ownership of their data online. We observed increasing confidence and nuance in how they understand data autonomy. By working with older children and comparing perspectives across age groups, we identified a strong and consistent desire among children to have greater control over their data. This led to the development of our CHAITok app, which mimics the experience of using short-form video platforms while giving children significantly more control over their data.
While these research efforts have given us valuable insights into children’s lived experiences with digital technologies, they have also highlighted a critical gap: there are still very few ethical digital options that allow children to benefit from technology without being exposed to algorithmic manipulation, addictive design, or associated social anxieties. The rapid rise of generative AI in early 2024 has further intensified both the risks and the opportunities—making it even more urgent to ensure that children can flourish in a digital childhood, rather than be exploited by it.
What is EMBER?
EMBER is our initiative to create empathetic and respectful AI technologies for children (anyone under 18), enabling them to grow up safely while retaining their agency.
We design EMBER technologies around five core principles:
- Empathetic - Children’s needs and vulnerabilities are recognised, acknowledged, and supported, not exploited.
- Mindful - Children are given the confidence to explore and connect to themselves, others, and nature.
- Balanced - Children experience control and joy, with the option to pause, disengage, or seek alternatives.
- Empowering - Children feel in control, can self-regulate, and can ask for help.
- Respectful - Children's preferences and voices are heard and valued, not manipulated.
We believe EMBER must be implemented at both the algorithmic and user experience levels:
- We need to design AI models and agentic systems that embody EMBER principles
- We need to create user interfaces that reflect and reinforce these principles
Implementing EMBER machines for children
What might an EMBER application look like?
In the next blog post, we will walk through child-centred scenarios developed by KORA and show how an EMBER AI offers children an option that is not only safer, but also more empathetic, mindful, balanced, empowering, and respectful.