Agalia Tan
@agaaalia
Senior Strategist at We Are Social
RADAR Chapter Lead for Singapore
“Closure is a thing of the past. One of the emergent qualities of the digital culture millennials shaped is that nothing ends any more. Wars and pandemics drag on; aging bands keep touring in a perpetual state of reunion rather than breaking up; politicians circle the drain into their eighties…”
In the ambient future, brands will compete not for screen real estate, but for mental bandwidth and emotional resonance.
We’re moving from gadgets in our hands... to intelligence all around us.
One of the clearest signals from Nvidia, OpenAI, and even smaller startups was the rise of the Mixture of Experts (MoE) architecture.
Instead of a single, giant LLM handling every task, an MoE model routes each query to a small set of specialised experts, each tuned for a specific domain (e.g. financial language, retail SKU logic, creative copywriting).
It’s cheaper: Not every query hits a massive model, reducing compute costs.
It’s faster: Only relevant experts “activate,” improving speed.
It’s more precise: Specialists outperform generalists in nuanced tasks.
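To make the routing idea concrete, here is a minimal, hypothetical sketch in plain NumPy (not any production framework, and not how Nvidia or OpenAI actually implement it): a gating layer scores the experts, only the top-k are activated, and their outputs are blended.

```python
# Minimal sketch of Mixture-of-Experts routing.
# All names, dimensions, and weights are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

DIM, N_EXPERTS, TOP_K = 16, 4, 2

# Each "expert" is a tiny specialised layer, represented here by its own weight matrix.
expert_weights = [rng.normal(size=(DIM, DIM)) for _ in range(N_EXPERTS)]

# The gating (router) layer scores how relevant each expert is to a given input.
gate_weights = rng.normal(size=(DIM, N_EXPERTS))

def moe_forward(x: np.ndarray) -> np.ndarray:
    """Route input x to its top-k experts and blend their outputs."""
    scores = x @ gate_weights                 # one relevance score per expert
    top_k = np.argsort(scores)[-TOP_K:]       # only these experts "activate"
    probs = np.exp(scores[top_k] - scores[top_k].max())
    probs /= probs.sum()                      # softmax over the selected experts only
    # Only TOP_K of the N_EXPERTS matrices are used per query: less compute, lower latency.
    return sum(p * (x @ expert_weights[i]) for p, i in zip(probs, top_k))

output = moe_forward(rng.normal(size=DIM))
print(output.shape)  # (16,)
```

The detail to notice is that only TOP_K of the expert weight matrices are ever touched for a given query, which is where the cost, speed, and precision benefits described above come from.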
Closed models → fast experimentation
Proprietary models → scalable differentiation