Health Minister Ong Ye Kung touring the inaugural AI in Healthcare Technology Showcase at NUS on Nov 13. PHOTO: LIANHE ZAOBAO
SINGAPORE – Singlish has long been a hurdle for artificial intelligence (AI) voice assistants, given the many accents and languages used in Singapore.
But one healthtech project, named AI Singapore Speech Lab, has made strides in teaching AI models Singapore’s spin on English.
Its Singlish-savvy large language model has been deployed in clinics and is being tested with the Singapore Civil Defence Force (SCDF) to help 995 call-takers by transcribing calls and prompting them to ask vital questions during emergencies.
In a recent breakthrough, the project’s AI models can dynamically transcribe multiple local languages, including English, Malay and Mandarin, said Professor Chng Eng Siong, an expert in computing and data science at Nanyang Technological University.
The project was presented at the AI in Healthcare Technology Showcase by national programme AI Singapore at the National University of Singapore on Nov 13, alongside 15 other tech projects, including chatbots and digital assistants designed to enhance efficiency in the healthcare sector.
In Singlish, it is common to mix several languages and slang from local dialects. This blending makes the patois unpredictable for AI models to interpret and a challenge for programmers.
“Singlish has been one big problem for local developers,” Prof Chng told The Straits Times. “Our accent and slang are difficult for existing AI models to catch.”
“You also have locations in Singapore that most speech-to-text models cannot capture,” he added, citing examples such as Choa Chu Kang, Tampines and Jalan Bahar – words not in the English dictionary.
Deployed at SingHealth Polyclinics, the AI model supports medical staff during interviews with patients. Prof Chng said that for privacy reasons, the AI model is isolated from the internet and performs transcription entirely on the user’s computer.
Live demonstrations at the showcase showed the AI model transcribing multiple languages and heavy Singlish on the fly, producing more accurate transcriptions for call-takers to refer to.
The Singapore market may be too small for tech companies to invest in handling local code-switching, a gap the project aims to fill, said Prof Chng. With an accuracy rate of roughly 90 per cent, the model's transcriptions are reliable enough for call-takers to make sound judgment calls and analyses, given that there is still a human in the loop, he added.
AI Singapore and its industry partners have provided the speech lab with close to $5 million over at least five years.
The same technology is being tested with SCDF under another AI Singapore-funded project, Intelligent Telephone Triage in Pre-Hospital Emergency Care.
The AI model transcribes calls from the public to emergency responders and automatically fills out a form that call-takers use to quickly relay information to other departments. The information includes details such as the caller’s name, age, location and specifics of the incident.
The system also prompts call-takers on follow-up questions to ask and suggests actions, such as calling for an ambulance in dire situations.
Medical AI researcher He Kai said the AI model helps responders capture critical details more quickly and accurately during a call, as any mistake can waste resources or endanger patients in need.
Teaching AI to understand Singlish – as well as a long list of medical terms – has taken heaps of data.
The researchers said the data includes more than 300,000 samples of call logs from the authorities and public databases, as well as data from the Infocomm Media Development Authority’s National Speech Corpus – the first large-scale database of Singapore English. The database was compiled to help researchers and developers of speech-related applications improve understanding of locally accented English.
Communities around the world are working to tune AI models to specific cultures, ensuring better representation for local groups. For example, Iceland is collaborating with ChatGPT maker OpenAI to preserve the Icelandic language and help non-English-speaking communities access AI more effectively. Likewise, AI Singapore is developing a series of large language models called Sea-Lion (South-east Asian Languages In One Network), tailored to the languages and cultures of the region.
In his opening speech at the healthtech showcase at the Shaw Foundation Alumni House, Health Minister Ong Ye Kung encouraged the healthcare sector to adopt AI strategically.
The Ministry of Health, for instance, is deploying AI across the sector, from administrative tools to imaging technology that helps spot diseases early, to drive efficiency and improve people's health, he said.
In one project, AI is being trained to analyse wounds, and deliver timely reports and recommendations for patients with chronic conditions.
AI is also being tested at the Singapore General Hospital (SGH) as an additional layer of checks for pharmacists and physicians and to suggest choices of medication.
The current model, developed as part of a project named NexRx.Ai, is trained on 80 per cent of medications encountered by diabetic patients and is designed to explain its recommendations to users so that its decisions are transparent, said SGH principal clinical pharmacist Jasmine Ong.
Medication errors affect 1.3 million people a year, according to the World Health Organisation, which estimates that such errors cost at least US$40 billion (S$53 billion) globally each year.