The real challenge isn't just about blocking or safeguarding against AI crawlers. It's about making sure the right information surfaces with integrity. Investors and advisors should see content that's accurate, trustworthy, and easy to find when AI systems answer their questions.
Sunil Odedra - Chief Technology Officer, Kurtosys
In the lead-up to TSAM New York on October 9, our Client Relationship Manager, Brad Burgunder, sat down with Sunil Odedra (Chief Technology Officer) and Michelle Wright (Chief Product Officer) to explore a question that is becoming increasingly relevant across the industry: who is really visiting your website?
Are they genuine visitors, or AI crawlers? And if AI traffic is on the rise, how should firms prepare to keep their websites resilient and their content visible?
Sunil addressed this subject at TSAM London earlier this year, and the landscape has continued to evolve since then. In this conversation, he shares his perspective on those changes, and Michelle offers her insights on Generative Engine Optimisation (GEO) ahead of her upcoming fireside chat at TSAM New York, where she will delve deeper into these themes and more.
Watch the full interview here:
What has changed since April?
The pace of change in this space has been remarkable. According to Sunil, three key shifts have emerged in the past six months:
- Paywalls for bots are being tested
Major content providers are beginning to set clearer standards for AI access. Some platforms are experimenting with mechanisms that require AI crawlers either to accept terms of use or to pay before accessing content. This signals a shift from a simple “block or allow” mindset toward a more structured set of “rules of engagement”.
- Licensing signals are converging
Industry efforts are underway to create a straightforward, machine-readable way for websites to specify how their content may be used by AI systems, independent of any paywall or charging model. This represents an important step toward transparency and standardisation (an illustrative sketch follows this list).
- “Stealth crawling” is on the rise
Recent headlines have revealed that some AI services are crawling sites without identifying themselves properly, or without respecting “do not crawl” instructions. This worrying trend underscores the need for firms to move beyond relying on goodwill alone. Clear policies and active monitoring, particularly for unusual traffic spikes, are becoming essential (a minimal monitoring sketch appears below).
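To make the second shift concrete, here is a purely hypothetical sketch of what such a machine-readable declaration could look like. No single standard has been adopted yet, and the signal names below are invented for illustration; today's robots.txt can only express crawl permissions, not usage terms.

```
# HYPOTHETICAL licensing signals: illustrative only, not a published standard.
# A converged convention might let a site declare usage terms alongside crawl rules:
#
#   ai-train:  disallow        # do not use this content to train models
#   ai-answer: allow           # citing this content in AI answers is permitted
#   licensing-contact: licensing@example.com
#
# The crawl rules below are ordinary, valid robots.txt directives.
User-agent: *
Disallow: /client-portal/
```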
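On the third shift, active monitoring does not have to be a heavyweight project. The following is a minimal sketch in Python, assuming an Apache/Nginx “combined” access log; the log path and spike threshold are placeholders to tune against your own baseline traffic.

```python
import re
from collections import Counter

LOG_PATH = "access.log"        # placeholder path to your web server log
SPIKE_THRESHOLD = 500          # requests per user agent per hour; tune to your baseline

# Combined log format, e.g.:
# 203.0.113.7 - - [09/Oct/2025:13:55:36 +0000] "GET /funds HTTP/1.1" 200 512 "-" "GPTBot/1.0"
LINE_RE = re.compile(
    r'\[(\d{2}/\w{3}/\d{4}):(\d{2}):\d{2}:\d{2} [^\]]+\] '   # date and hour
    r'"[^"]*" \d{3} \S+ '                                     # request, status, bytes
    r'"[^"]*" "([^"]*)"'                                      # referrer, user agent
)

counts = Counter()  # (date, hour, user_agent) -> number of requests
with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        match = LINE_RE.search(line)
        if match:
            date, hour, agent = match.groups()
            counts[(date, hour, agent)] += 1

# Flag any user agent whose hourly volume exceeds the threshold.
for (date, hour, agent), n in sorted(counts.items()):
    if n >= SPIKE_THRESHOLD:
        print(f"{date} {hour}:00  {n:>6} requests  {agent[:60] or '(empty user agent)'}")
```

Comparing the flagged user agents against the published names of known crawlers, and treating empty or generic agents with suspicion, is often enough to spot stealth crawling worth investigating.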
What can teams do now without turning it into an IT project?
Let’s look at some practical steps for content owners.
Sunil notes that preparing for AI-driven traffic doesn't require a complete overhaul, as many of the steps are consistent with existing best practices for search indexing.
- Remove unnecessary barriers: Pop-ups and gated content can prevent crawlers from accessing key information.
- Reconsider PDF-heavy strategies: Content locked in PDFs may be overlooked, as crawlers often cannot index deeply within these files. Moving important information to landing pages ensures it is both visible and accessible.
- Optimise for AI-style queries: Review headings and on-page content with an eye on how users phrase questions through AI assistants. Simple adjustments in wording can significantly improve discoverability (see the markup sketch below).
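For instance, a heading such as “Fees” can be reworded as the question an investor would actually ask, and the same answer can be exposed as structured data. The snippet below is a minimal sketch using schema.org's FAQPage markup; the question, answer, and figures are placeholders, not real fund data.

```html
<!-- A question-style heading mirrors how investors phrase queries to AI assistants -->
<h2>What are the ongoing charges for this fund?</h2>
<p>The ongoing charges figure (OCF) is 0.45% per annum, deducted from the fund's assets.</p>

<!-- The same Q&A exposed as machine-readable structured data -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [{
    "@type": "Question",
    "name": "What are the ongoing charges for this fund?",
    "acceptedAnswer": {
      "@type": "Answer",
      "text": "The ongoing charges figure (OCF) is 0.45% per annum, deducted from the fund's assets."
    }
  }]
}
</script>
```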
Blocking all AI traffic can limit visibility, while opening everything up may lead to loss of control. A balanced approach works best: decide which sections should be accessible, keep information accurate and clear, and monitor for unusual traffic spikes that may indicate automated activity.
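One lightweight way to express that balance is a robots.txt policy. The sketch below is a minimal example: the paths are placeholders for a typical site layout, while the user-agent tokens shown are the published identifiers of several well-known AI crawlers.

```
# Let known AI crawlers read public insight and fund content...
User-agent: GPTBot          # OpenAI
User-agent: ClaudeBot       # Anthropic
User-agent: PerplexityBot   # Perplexity
Allow: /insights/
Allow: /funds/
Disallow: /client-portal/

# ...and keep the default policy for everyone else.
User-agent: *
Disallow: /client-portal/
```

Since robots.txt is advisory rather than enforced, pairing it with the log monitoring described earlier closes the loop.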
Generative Engine Optimisation is not a one-off project – it’s a new habit. Firms that structure content clearly and make it machine-readable will be the ones investors can actually find in the age of AI assistants.
Michelle Wright - Chief Product Officer, Kurtosys
Read more about optimising for GEO here:
AI-driven traffic is still a relatively new concept for the industry; only a few months ago, terms such as GEO were unfamiliar to many. As firms continue to refine how they detect, filter, and measure engagement, we can expect a considerable amount of trial and error. While best practices are beginning to emerge, they will only take hold with broad industry buy-in.
Looking ahead, Michelle foresees the development of more widely accepted “playbooks” that will provide frameworks to help firms evolve in step with advances in AI technology.
At TSAM New York, Michelle will be hosting a fireside chat on the theme: “Beyond SEO: Generative Engine Optimisation (GEO) for Asset Managers.” The discussion will highlight:
- Why understanding investor behaviour is critical, as audiences increasingly seek direct, concise answers rather than scrolling through multiple links.
- How structuring content in a useful, clear, and machine-readable way is becoming central to visibility.
- Practical steps to make GEO part of an ongoing process (a habit, not a project).
Kurtosys will be hosting the Marketing and Client Services stream at TSAM New York on October 9. Catch the team at the booth or attend our stream. With panel discussions and fireside chats throughout the day, we will be talking to experts about the key topics and challenges facing the industry.