A recent article from The Times highlights the journey of Liang Wenfeng, founder of DeepSeek AI, who has become a national hero in China. His company’s advancements in cost-effective AI models have not only boosted national pride but also kickstarted an AI wave in China, inspiring other models like the ERNIE X1, which boasts ‘high EQ’. I’m both skeptical and excited: skeptical mostly about the reliability of Chinese methods (or as Trump calls it, ‘Chyna’) and because these models seem to compete on cost rather than utility; and I do believe that even in the AI space there will always be a need for an Apple/Android market model: an expensive, high-end elite version (that’s Apple) and the mass-market Android (for the rest of us poors). ChatGPT is currently front-running that race, with DeepSeek (or its peers) serving as the competitive foil to the former’s dominance. For one, my students are hardly using DeepSeek, but a few brave ones are dipping their toes into the welcoming pool of ChatGPT. I’m also excited because disruption is the nature of progress, and AI is its latest, most unpredictable herald.
I’ll mention it here just as I have elsewhere: AI won’t steal your job, but the ones who are AI-natives will. Just as the sea nymph Thetis dipped her son Achilles into the river Styx to grant him immunity and strength, the pioneers willing to wade into the chilling AI waters will acclimatise and become far better at navigating them than those who shun AI’s increasing pervasiveness in society. Of course, we all know how that story ended: more and more frequent AI use also spells an increasing insularity from the social and human world (always a fear of mine).
Meanwhile, China is taking no half-measures, moving decisively to incorporate AI education at all levels. Starting this fall, Beijing has mandated AI education across all compulsory education levels, from elementary to high school. Schools are required to offer at least eight hours of AI instruction per academic year, either as standalone courses or integrated into existing subjects. The country’s intent is clear: rather than spend time playing catch-up and developing the next ChatGPT, wait this cycle out, and be so prepared for the next one that the rest of the world has no choice but to abide by China’s rules. Across the Pacific, the U.S. trudges forward, burdened by the weight of bureaucracy, ensnared in debates over ethics and regulation—an old titan, slow to pivot, unsure of its own footing.
This is not to say that the gap is easily surmountable. The U.S. economy is currently dominated by tech companies—and Trump himself has cozied up to many infamous tech billionaires who seem keen to leverage their relationship with the White House to push their products into more (and smaller) hands. RIP, KOSA1. Beijing seems to be betting on discovering its own cohort of government-friendly tech CEOs by sowing the seeds at the playground and in the classroom—a gamble which I’m sure will pay dividends 15 years later when these kids discover that their government-issued iPads have so much more utility than just streaming the Chinese equivalent of CocoMelon.
Similarly, Estonia has launched the AI Leap 2025 initiative in collaboration with tech companies like OpenAI and Anthropic. Beginning in September 2025, the program will provide AI learning tools to 20,000 high school students aged 16 to 17, with plans to expand to vocational schools and younger students the following year. This seemed like a cookie-cutter educational-tech initiative until I realised it was AI-specific rather than just a broad-based technological push. I also raised an eyebrow when I saw this in the article:
GovInsider previously spoke with Kallas when she met with Singapore’s Minister of Education to explore cross-border collaborations to leverage on respective strengths: STEM-focused Singapore and digital-first Estonia.
For all of our country’s attempts at being a Smart Nation, it feels like we lack the gumption or vision to launch the big initiatives needed to put ourselves in pole position for a future dominated by A.I. and similar technologies.
Yes, there is a need to assess and monitor (we are always monitoring the situation) the implications of such technologies on both our citizenry and our national interests. Yes, there are other real-world bread-and-butter matters like housing and cost-of-living issues. Yes, there is a need to protect our kids from actual bullies in the real world and to sensitise them to cyber-literacy and safety online. I recently had to teach a lesson on why voyeurism—both in the real world and in the reel world—is bad. I balked not only at the topic—is it not common sense?—but also at the state of the lesson materials. They merely skimmed the surface rather than delving into potentially real and dangerous case studies like deepfakes (can we have A.I. superimpose one of my classmates’ faces onto the body of an adult actress?), digital consent (should we send someone a d*ck pic? The answer is not as simple as it seems, according to my students), and even digital necromancy, where, apparently, A.I. can now parse through reams of WhatsApp chats to produce a functioning simulacrum of our deceased loved ones. We are verging on the reality proposed by Netflix’s Black Mirror, and the future sure doesn’t seem bright.
I’m expecting the ‘Keep me company’ use case to skyrocket in the next five years.
Yes, Singapore is world-renowned for our educational excellence; and yes, we are actively integrating AI into our education system. Let’s be real, though: the Student Learning Space (SLS) platform, which ‘incorporates AI-enabled tools to enrich learning and support teachers’, seems more conservative in its implementation than one that encourages self-exploration and discovery for our students. The Ministry of Education’s AI-in-Education (AIEd) Ethics Framework speaks more of ‘guiding the safe and responsible development of A.I. systems’ and ‘ensuring that A.I. practices are in alignment with professional and ethical beliefs’. Are safety and exploration necessarily opposed? I don’t think so, but when we take a systems approach to managing what is essentially a disruptive tool, it further reinforces the conundrum that Singapore faces—in a world that is increasingly chaotic and whose mantra is increasingly Move Fast and Break Things2, can we really afford to play it slow like the much bigger countries? Where is the vision? If anything, our nimbleness and speed as a small country is a comparative advantage we have not fully leveraged: things like the SimplyGo fiasco, the platitudes towards data protection and our NRIC numbers, and the fact that we have yet to fully transition to a digital payments system in the vein of China’s have me questioning how Smart our Nation truly is. There is a reason why more students opt to be doctors and lawyers than entrepreneurs: when our education system privileges the maintenance of the status quo and discourages failure and experimentation, all we get are adults who cannot weather uncertain storms.
On the subject of AI, I am especially frustrated by the Education Minister’s comment on an ‘effective class size of one’, and how technology can ‘complement the capacity of teachers’. I can tell you for sure that if I had had ChatGPT as a student, my academic life would have been so much easier. Again, the ones who can leverage A.I. will outperform those who cannot. The Minister goes on to add that ‘one possible way to manage class sizes is to have smaller classes for students who need more supervision from teachers, with larger ones for students who can undergo self-initiated learning with the aid of technology’. Isn’t technology going to be used in smaller classrooms, too? I do believe that sets up a false premise that sidesteps the elephant in the room: that reduced class sizes contribute to better educational outcomes. Never mind the vague studies that MOE loves to cite: no teacher will ever prefer a larger class to a smaller one. Underlying the notion that teachers should use technology to scale is an inherently industrial belief (read: outdated by 50 years) that students are fungible, when they are clearly not. Even OECD data suggests that the average primary school class in OECD countries has about 20 students, roughly half of what many Singaporean primary schools have. A webinar that reaches 200 students in a mass lecture ignores individual learning needs, and yet this is the hill we want to die on. I have three classes of 28 this year, and as lovely as some of the students are, it’s still a numbers game at the end of the day when it comes to developing them as students and humans, interacting with them regularly beyond academic hours, and marking (oh god, the marking). Introducing edTech tools might add some gravy, but none have so far impressed me in teaching the skills I want my students to learn: critical thinking, the courage to fail and the gumption to adventure, storytelling, and the ability to emotionally regulate and self-soothe. These skills are our niche, and while edTech tools are certainly excellent at fostering collaboration, discussion, and efficiencies in learning, they cannot teach any of these human skills. The teacher IS the technology.
The prevalence of tuition agencies in Singapore further complicates an already messy educational environment. With the integration of A.I., some tuition agencies are exploring A.I.-powered learning platforms to provide personalised tutoring, immediate feedback, and immersive learning experiences. For instance, Tutorly.sg offers an AI-driven platform that serves as a personal tutor, providing study notes, quizzes, and practice questions aligned with Singapore’s MOE syllabus. Now, this is already being trialled in schools, but again, I do believe the extent of the tech/A.I. roll-out is half-hearted: either force everyone to get on board (which MOE has done in the past) or stop shifting the blame to private tuition agencies for a system which MOE itself set up. MOE spent S$3.2 billion in government expenditure in 2023, while families in Singapore spent S$1.8 billion on private tuition—an increase of approximately 64% in just 10 years. Yes, the aims of private tuition and public education are different. The ones with the most to lose? The students who can’t afford to buy their way into AI-enhanced learning, and we do have a duty of care to the most underprivileged.
I was recently speaking to some ex-colleagues who have since left teaching about how I secretly wish we’d get a Musk-ian (is that an adjective now?) DOGE-like cleanse of our bloated teaching fraternity: the people most aggrieved by mediocre teachers are usually the others working in their department. It’s a major push factor when the ‘experienced’ teachers are coasting and earning twice to thrice your salary while the younger ones are forced to take up other school duties and projects in the name of ‘visibility’. Is it any wonder that so many are working towards a career in the increasingly lucrative private tuition industry, or becoming FAJT3? It’s a vicious cycle I don’t foresee stopping anytime soon.
As AI continues to permeate the educational landscape, concerns about teacher redundancies arise. I do wish that my job could be automated to a degree, but I also want to stand and journey with my students, girding them for a future where they are increasingly atomised, where every aspect of their lives is sold for parts and commodified. Like Thetis at the start of this post, we risk making the same mistake with our students: dipping them into the Styxian currents of technological advancement while forgetting that the most crucial part of their humanity—their uncertainty, their hesitation, their ability to wrestle with the unknown—must still be held by human hands. Because who will teach them how to make peace with silence? Who will show them the art of sitting with an unanswered question, not as an error in need of correction, but as a possibility to be held? AI can instruct, assess, and predict, but it does not linger in the quiet uncertainty of a student on the verge of understanding.
Thank you for reading this far, and keep being the brightest star for others.
KOSA, or the Kids’ Online Safety Act, is a federal bill designed to protect children from online harms. In December, the Senate released yet another version of the bill—this one written with the assistance of X CEO Linda Yaccarino. This version includes a throwaway line about protecting the viewpoints of users as long as those viewpoints are “protected by the First Amendment to the Constitution of the United States.”
A motto popularised by Mark Zuckerberg at Facebook, which emphasises speed and experimentation, suggesting it is better to make mistakes and disrupt technologies than to play it safe at a slow pace.
Flexi-adjunct teachers: former teachers who have retired or resigned.