Insights

How AI enhanced our strategic research without compromising authenticity

Written by ClearPoint | May 3, 2026

When organisations need deep stakeholder insights to inform strategic decisions, traditional research timelines often don't align with business planning cycles. The emergence of AI in research has shown both challenges and opportunities — reinforcing the importance of transparency and accuracy. But when applied thoughtfully, AI can dramatically accelerate analysis while preserving authentic stakeholder voice.



The challenge: strategic research at business speed


The University of Otago approached us to strengthen their existing learner journey map, capturing fresh perspectives from students and key stakeholders to inform their digital strategy and university-wide improvements. This strategic work needed to capture authentic experiences across diverse cohorts while delivering insights rapidly enough to influence current planning cycles.

Working with a significant volume of content, including 17+ session transcripts and over 400 individual digitised sticky notes, we delivered a comprehensive analysis in just nine weeks. A comparable analysis would typically take 4–6 months.



Enhanced research capabilities in action


Data privacy and cyber security: our "safety first" approach

We recognise that using AI in research requires rigorous protection of stakeholder data. Our methodology is built on a foundation of enterprise-grade security:

  • Secured Environments: We use only corporate-licensed AI platforms (Google Gemini, Claude Team, and ChatGPT Enterprise) governed by enterprise agreements.
  • Data Training Protection: These agreements guarantee that client data will be kept confidential and never used to train public AI models.
  • Compliance & Standards: Our AI policy is part of a security framework aligned with ISO 27001 and fully complies with the NZ Privacy Act 2020.


Human accountability
AI tools enhance our capabilities but never replace human judgment; every insight is reviewed, validated, and owned by our professional researchers.

Authentic voice preservation
AI helped translate participant feedback into actionable language without losing authenticity. Raw feedback could be reframed into strategic language while preserving the urgency that drives institutional change.

Contextual intelligence that preserves individual voice
Enhanced analysis enabled us to maintain context that would be lost in traditional approaches, connecting specific feedback to participants' backgrounds, circumstances, and experiences. Instead of treating responses as anonymous data points, we could instantly recall who said what and in what context. This revealed that seemingly similar concerns had completely different underlying causes depending on stakeholder circumstances.

Pattern recognition at scale
AI identified consistent themes across different participant groups:

  • digital systems challenges affecting multiple cohorts
  • process confusion, particularly impacting specific student groups
  • structural barriers that compounded difficulties.

These patterns emerged clearly and efficiently.


Real-time strategic value
Within hours of each session, university leadership received meaningful summaries. When participants described decision-making processes more complex than the university currently acknowledged, we could flag these insights immediately rather than days later.

The critical moment
ClearPoint maintains access to multiple AI platforms, each with different strengths. When our initial choice began fabricating content despite explicit verification instructions, we shifted to a platform better suited to specialised analysis (Claude) and implemented systematic verification protocols. The result: zero fabricated content in the final deliverables, a compressed timeline without compromising quality, and comprehensive insights that let the university begin strategic discussions and planning for implementation before year-end, rather than waiting for the following academic cycle.



What made the difference


AI handled systematic processing: transcript analysis, pattern recognition, and cross-referencing.

Team maintained strategic oversight: cultural sensitivity, stakeholder communication, and quality assurance.

Three-layer verification: AI cross-referencing, human contextual review, and source triangulation.
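The source-triangulation layer can be sketched in code. This is an illustrative sketch only, not ClearPoint's actual tooling: the `Insight` structure and function name are hypothetical. The core idea is that every AI-extracted quote must appear verbatim in its claimed source before it reaches a deliverable, and anything that cannot be traced goes to human review rather than being silently dropped.

```python
from dataclasses import dataclass

@dataclass
class Insight:
    theme: str       # theme the analysis assigned
    quote: str       # verbatim supporting quote extracted by the AI
    source_id: str   # transcript or sticky-note identifier

def verify_against_sources(insights, transcripts):
    """Confirm each AI-extracted quote appears verbatim in its claimed
    source. Failures are flagged for human contextual review."""
    verified, flagged = [], []
    for insight in insights:
        source_text = transcripts.get(insight.source_id, "")
        if insight.quote and insight.quote in source_text:
            verified.append(insight)
        else:
            flagged.append(insight)  # untraceable: route to a human
    return verified, flagged

# Hypothetical example data
transcripts = {
    "session-01": "The enrolment portal timed out twice before I could pay."
}
insights = [
    Insight("digital systems",
            "The enrolment portal timed out twice before I could pay.",
            "session-01"),
    Insight("digital systems",
            "The portal always crashes.",  # paraphrase not in the source
            "session-01"),
]
verified, flagged = verify_against_sources(insights, transcripts)
# the unverifiable paraphrase lands in `flagged`, not in the deliverable
```

A check this mechanical will miss legitimate paraphrases, which is exactly why a human review layer sits behind it.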

Capabilities previously impossible
This approach unlocks capabilities that traditional methods can't match: complete quote traceability, pattern analysis across recorded sessions, and the ability to query complex datasets conversationally. You could ask 'Show me all concerns about financial processes from first-year students' and get instant, verified results.

But perhaps the most valuable aspect is being able to revisit sessions months later and ask: 'Give me a quick summary,' 'Did anyone mention accessibility concerns?' or 'What patterns emerged?' It is almost like a time machine for revisiting a past event. This transforms static transcripts into dynamic research assets you can interrogate indefinitely, which is invaluable for sharing findings with new stakeholders or when strategic priorities shift.
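Conversational querying like this only works if every verified excerpt is stored with structured tags. A minimal sketch of that idea follows; the field names and data are hypothetical, not ClearPoint's actual tooling, and a real system would translate the natural-language question into these filters.

```python
# Hypothetical structure: each verified excerpt carries the tags
# the analysis attached to it (cohort, topic, source session).
excerpts = [
    {"cohort": "first-year", "topic": "financial processes",
     "quote": "I didn't know my fees were due until the reminder email.",
     "session": "session-03"},
    {"cohort": "postgraduate", "topic": "financial processes",
     "quote": "My scholarship payment arrived later than promised.",
     "session": "session-07"},
    {"cohort": "first-year", "topic": "digital systems",
     "quote": "The portal logged me out mid-enrolment.",
     "session": "session-01"},
]

def query(excerpts, cohort=None, topic=None):
    """Return every excerpt matching the given tags; a None filter
    matches everything, so queries can be broad or narrow."""
    return [e for e in excerpts
            if (cohort is None or e["cohort"] == cohort)
            and (topic is None or e["topic"] == topic)]

# "Show me all concerns about financial processes from first-year students"
results = query(excerpts, cohort="first-year", topic="financial processes")
```

Because each result keeps its session identifier, every answer remains traceable back to a source transcript.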



What organisations should expect from enhanced research


Continuous platform evaluation: Your research partners should use multiple platforms with distinct strengths that evolve over time. What works best for processing stakeholder interviews might not be optimal for cross-referencing insights or translating feedback into strategic recommendations. Seek partners who regularly assess and adapt their tools.

Build verification processes: Look for partners who build accuracy checks into their core processes rather than adding verification as a final step. This means developing approaches that prioritise authenticity over speed, cite specific sources, acknowledge data gaps, and flag when claims cannot be verified.

Transparent methodology: Your research partner should communicate their enhanced capabilities during project scoping, explaining how advanced tools will accelerate analysis while maintaining rigorous quality standards. This transparency should build confidence in the approach rather than create concerns about methodology.

Professional accountability: Enhanced tools should amplify professional expertise, never replace it. Expect clear quality-assurance protocols, with all content reviewed by experienced researchers who can verify any questionable insights against source materials.

Systematic implementation: Partners should demonstrate their enhanced capabilities through proven applications rather than experimental approaches. Look for evidence of tested protocols and established processes.



The strategic advantage


When implemented thoughtfully, enhanced research amplifies authentic insights without sacrificing efficiency. Our work with the University of Otago amplified genuine participant voices, accelerated analysis, and made findings strategically accessible to leadership. The outcome? Insights that informed strategic planning and implementation roadmaps, delivered in weeks rather than months. That speed is a critical advantage.

This new methodology has expanded what's possible for strategic research, enabling comprehensive stakeholder analysis that was previously resource-prohibitive.

Organisations increasingly need research partners who can demonstrate both technical sophistication and professional diligence. The question isn't whether to use AI-enhanced research capabilities, but how to identify partners who can implement them responsibly while preserving the authentic insights that drive strategic decisions.