Exploring tech for child helplines
We recently co-hosted a special virtual event with Child Helpline International for national child helplines, exploring technology advances that will empower these groups to do more for the world’s children (view the slide deck for the event here). This workshop built on a previous event we hosted in late 2022 in Stockholm, Sweden, which was very well received by child helplines. The helpline community assists many millions of children, and it is no surprise that better technology is needed to keep scaling their ability to serve.
Of course, no conversation like this can happen in 2024 without talking about AI. And there’s no more important voice to trust than your peers in the field. We talked about AI and heard from several helpline technology leaders about how they are using it effectively today.
The roles AI and humans play for helplines
I kicked off the session with an AI overview entitled “The Future of Tech for Child Helplines is: Humans!” The essence of helplines is empathetic human-to-human contact, and our tech applications need to support and enhance that human connection. Because of their unique position in their communities, helplines often hold critically important datasets by and about children in their countries. We need to use that data to better assist children in need, while also securing it against misuse (and against harm to the children whose data is captured).
There are exciting AI applications ahead: better hearing the collective voice of children, handling routine tasks so counselors can work more efficiently, supporting and training counselors, intelligent routing and triage, better quality and faster data collection, and more. I highlighted the importance of working as a network, and shared that our Aselo team is partnering with multiple helplines in North America, the Caribbean, and Africa to build open source AI technology together. Our learnings and the resulting software will be freely available to all Child Helpline International member helplines. AI is not cheap or easy to develop, and joint action and knowledge sharing are critical.
AI initiatives for child helplines
Darren Mastropaolo, the VP of Innovation and Data at Kids Help Phone, Canada’s national child helpline, shared their vision of AI for Canadian children and youth. Their dream is for AI to provide personalized recommendations, not to sell products, but to support help seekers on their wellness journey. They have already developed an AI-powered tool (in partnership with the Vector Institute, Canada’s leading AI institution) that identifies the primary issue discussed in a text conversation with over 90% accuracy. In addition, they plan to use data to identify service deserts: communities with a high burden of youth need but without adequate local services to meet those needs.
Charlotte Smerup from Børns Vilkår, the Danish national child helpline, presented their AI initiatives. They have created a Counselor’s Assistant, a “guide by the side” that provides real-time helpful hints to volunteer counselors based on AI-identified topics in active conversations. As a leader from a peer helpline, Charlotte gave a realistic assessment of both the opportunities and the challenges of AI. She noted that a lot of good-quality data is necessary, that it’s essential to invest in anonymization to address privacy concerns, and that AI innovation is expensive. Her team has been working on AI applications for about five years and feels it has been a boost for their programs.
Our closing speaker was Sam Dorison, co-founder of ReflexAI. His company is a spin-off from the Trevor Project, which operates a crisis response helpline for LGBTQ+ young people. Sam and his co-founder pioneered the use of AI for training volunteers several years ago, and they wanted to bring this capability to a greater range of helplines (including those serving veterans). They focus on using AI to improve the quality of helplines, both for training and for ongoing feedback. He pointed out the importance of role-playing in volunteer training, and mentioned an issue that hadn’t occurred to me: training simulators not only reduce demands on the limited number of experienced human trainers, but also spare those trainers from role-playing a suicidal teen day in and day out, which was psychologically wearing. The AI simulators increased the number of volunteers trained and improved the mental health of crisis helpline trainers.
AI as a tool to help counselors and young people contacting helplines
The excitement about AI’s potential was balanced with reality checks from organizations that have years of real-world experience using it. Nobody was advocating firing humans and replacing them with robots. Everyone was trying to use the technology to help real human beings, both the young people in need and the counselors helping them. I’m looking forward to bringing this kind of expertise together again in the coming years to share the latest learnings!