IN-cube at Nanyang Technological University uses NimbleLabs to uncover patterns of online information exchange that shape public trust and engagement in Singapore.
In our latest collaboration, the Centre for Information Integrity and the Internet (IN-cube) at Nanyang Technological University, Singapore (NTU) became the first enterprise customer to use NimbleLabs, our no-code AI platform built for clinical research teams. The collaboration builds on a partnership that has evolved over two years, from early research to enterprise-scale analysis of digital engagement patterns.
Together, we analyzed more than 550,000 messages and 10 million words across Telegram groups to understand how online narratives of distrust toward healthcare institutions form and spread, and how that insight can guide more effective public-health communication.
IN-cube leads research and analysis on internet use, part of which involves understanding what information is being exchanged online. Some of that information may be fake news, which has implications for what the public knows or believes across a range of issues, including those related to health. By improving how institutions recognize and respond to signals of disengagement, IN-cube’s work supports stronger relationships between the public and other stakeholders, including healthcare institutions.
Enterprise Collaboration
One of the main challenges IN-cube faced was the need for explainable, scalable AI pipelines that could process sensitive social data while ensuring full de-identification and privacy compliance. Using NimbleLabs, IN-cube researchers processed a much larger and more complex dataset—without writing a single line of code. The platform’s agentic AI pipeline handled everything from data ingestion to anonymization to model training and explainability.
Rather than waiting months, the team trained and validated multiple large language model classifiers within days; the top-performing classifier achieved an F1 score of 0.866 and 85.2% agreement with expert labels.
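For readers unfamiliar with these metrics, F1 balances precision (how many flagged posts were truly relevant) and recall (how many relevant posts were caught), while expert-label accuracy measures plain agreement with human annotators. The sketch below is purely illustrative, using made-up labels rather than IN-cube's data or NimbleLabs internals:

```python
# Illustrative sketch of the two metrics reported above.
# The label lists are invented examples, not IN-cube data.

def f1_and_accuracy(true_labels, predicted_labels):
    pairs = list(zip(true_labels, predicted_labels))
    tp = sum(1 for t, p in pairs if t == 1 and p == 1)  # true positives
    fp = sum(1 for t, p in pairs if t == 0 and p == 1)  # false positives
    fn = sum(1 for t, p in pairs if t == 1 and p == 0)  # false negatives
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0
    accuracy = sum(1 for t, p in pairs if t == p) / len(pairs)
    return f1, accuracy

# Hypothetical expert annotations vs. classifier output (1 = distrust signal)
experts = [1, 0, 1, 1, 0, 0, 1, 0]
model   = [1, 0, 1, 0, 0, 1, 1, 0]
f1, acc = f1_and_accuracy(experts, model)
```

In practice a library such as scikit-learn would compute these directly, but the hand-rolled version makes the trade-off explicit: a classifier can reach high accuracy while missing rare-but-important posts, which is why F1 is reported alongside it.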
What They Found
The results reinforce what we have found in our prior collaborations: the people most hesitant to seek care or engage with public institutions, such as the healthcare system, are often concentrated in small but active online clusters. Their posts, marked by distrust or frustration toward institutions, appear in short, high-intensity waves that can ripple across groups before fading. These cycles make early identification critical, allowing health communicators to intervene before disengagement deepens or spreads.
Why It Matters
For IN-cube, the ability to analyze half a million posts in days, not months, marks a major shift in what researchers can achieve with responsible AI. For us, it demonstrates the promise of NimbleLabs: bringing production-grade, privacy-safe machine learning to teams that don’t have dedicated engineering resources.
“Our collaboration with NimbleLabs made it possible for our team to analyze complex data at a scale and speed that was not possible before. It allowed us to identify clear patterns of where and why people disengage from legitimate news online, which has implications on what the public knows or believes in across different issues, including those related to health.”
Prof. Edson Tandoc, Director, IN-cube@NTU
By supporting IN-cube’s research on information flow and engagement, our collaboration demonstrates how responsible AI can strengthen both information integrity and health system participation, turning complex data into insights that advance community well-being.
Looking Ahead
Together, Nimblemind and IN-cube are proving that responsible, explainable AI can accelerate social research without compromising rigor or privacy.
Spanning two years of collaboration, this project reflects a shared commitment to using AI to better understand and strengthen public trust in healthcare.
As our first enterprise customer on NimbleLabs, IN-cube is setting the standard for how universities and research centers can operationalize AI, transforming months of analysis into actionable insights that strengthen public trust and increase healthcare engagement.
