SINGAPORE, Oct 5 — By March 2026, Google will prevent users in Singapore estimated to be under 18 from downloading apps deemed inappropriate for minors, such as dating or sexually explicit platforms, from its Play Store, The Straits Times reported yesterday.
The move comes as part of new age assurance requirements introduced by Singapore’s Infocomm Media Development Authority (IMDA) under its Code of Practice for Online Safety for App Distribution Services.
The same code applies to major tech players including Apple, Huawei, Samsung and Microsoft.
Google said it will also introduce restrictions on YouTube to promote healthier digital habits among teenagers. These include reminders to take breaks and limits on repetitive viewing of videos that glorify extreme body ideals or social aggression.
Ben King, Google Singapore’s managing director, said the new safeguards are designed to go beyond parental controls.
“This isn’t just about giving parents more tools. It’s about our systems automatically providing an added layer of protection to ensure that every young person has age-appropriate experiences,” he said at the company’s Safer with Google event.
Under the new system, Google will use machine learning to estimate a user’s age based on search and viewing behaviour. Users deemed underage will automatically have safety filters applied across multiple Google services.
On Google Search, explicit or violent material will be blocked by default, while on Maps, the Timeline feature that tracks location history will be disabled to reduce data collection. Teen users of the Gemini AI assistant will be prevented from generating images, and its responses will be fact-checked and include source citations.
If adults are mistakenly flagged as under 18, they will be able to verify their age by submitting a government-issued ID or a selfie.
Google’s head of government affairs and public policy for Singapore, Rachel Teo, said that even users browsing without signing in will have certain safeguards, such as blurred explicit images in search results and YouTube’s restricted mode.
While the measures aim to create safer digital environments for young users, similar roll-outs in the United States and Britain have sparked privacy concerns due to the use of algorithms to estimate users’ ages.