
Addressing data infrastructure when scaling institutional reporting

More money, more problems: part 2

In our latest blog series, we explore the challenges that come with a growing institutional asset base, particularly the increasing pressure on reporting. In this article, we deal with data infrastructure.

This area has advanced considerably, and getting your infrastructure right can make a big difference to reporting. It can reduce costs, speed up delivery and remove many common data issues altogether.

The traditional setup: brittle, fragmented and error-prone

For many managers, the current model involves moving data from the book of record (the “golden source”) into the reporting system. This usually happens in one of two ways:
  • File transfers: A scheduled script extracts data from the source system and drops it onto a shared server. Another system then picks it up and loads it into the reporting environment (a rough sketch of this pattern follows the list).
  • APIs (Application Programming Interfaces): APIs allow systems to connect directly. One system pushes data to another, often on a schedule.
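
To make the first pattern concrete, here is a rough, simplified sketch of the kind of scheduled extract job many firms still run. The table name, DSN and shared-server path are made up for illustration; a real job would sit inside a scheduler with logging and error handling.

```python
# Hypothetical nightly extract: pull positions from the book of record,
# write a CSV and drop it on a shared server for the reporting system.
# Table, DSN and path are illustrative, not a real configuration.
import csv
from datetime import date
from pathlib import Path

import pyodbc  # assumes an ODBC driver for the source database is installed

SHARED_DROP = Path(r"\\fileserver\reporting\inbound")  # shared server location


def export_positions() -> Path:
    conn = pyodbc.connect("DSN=book_of_record")  # hypothetical DSN
    cursor = conn.cursor()
    cursor.execute("SELECT portfolio_id, security_id, market_value FROM positions")

    out_file = SHARED_DROP / f"positions_{date.today():%Y%m%d}.csv"
    with out_file.open("w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow([col[0] for col in cursor.description])  # header row
        writer.writerows(cursor.fetchall())                      # data rows

    conn.close()
    return out_file  # a second scheduled job loads this file into reporting


if __name__ == "__main__":
    export_positions()
```

Every file this job drops is another copy of the data that can drift away from the book of record, which is exactly the fragility described below.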

These setups are functional but flawed. They create multiple copies of your data, which increases the risk of errors. If a number changes at the source but doesn’t get updated everywhere else, reports can quickly become outdated or, worse, incorrect. This is not only a serious compliance risk; it also undermines your credibility.

With multiple integrations running between internal systems, websites, client reports and partner platforms, it’s hard to keep everything aligned. And when something breaks, it’s rarely a quick fix.

The modern alternative: Cloud data warehouses

Cloud data warehouses, led by platforms like Snowflake, Databricks and Google BigQuery, have changed the way firms manage and share data. These platforms offer real-time access to centralised data, meaning there’s no need to create and maintain duplicate versions across systems.

Instead of lifting and shifting data around your infrastructure, these warehouses allow you to give other systems secure, direct access. This is often referred to as zero-copy data sharing.
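
As an illustration of what direct access looks like in practice, the sketch below shows a reporting job querying the warehouse itself rather than loading a nightly export. It uses the Snowflake Python connector purely as an example; the account, credentials and object names are placeholders, and other platforms offer equivalent client libraries.

```python
# Minimal sketch: a reporting job querying governed data directly in the
# warehouse instead of loading a nightly file. Account, credentials and
# object names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345.eu-west-1",   # placeholder account identifier
    user="REPORTING_SVC",          # service user for the reporting tool
    password="********",           # in practice, use key-pair auth or SSO
    warehouse="REPORTING_WH",
    database="GOLDEN_SOURCE",
    schema="POSITIONS",
)

cur = conn.cursor()
cur.execute(
    """
    SELECT portfolio_id, security_id, market_value
    FROM positions
    WHERE as_of_date = CURRENT_DATE
    """
)
rows = cur.fetchall()  # always reflects the latest governed data: no export step
cur.close()
conn.close()
```

Because the query runs against the governed source, there is no second copy sitting on a file server waiting to fall out of date.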

Here’s how they help:

  1. Stop duplicating your data
    With zero-copy access, your reporting tools can query live data directly from the warehouse. There’s no need to create daily exports or move data files around. This ensures your reports are always based on the latest available information and dramatically reduces the risk of data mismatches.

  2. Improve collaboration across teams and providers
    Let’s say your ESG team maintains environmental and sustainability data, sourced from providers like TruCost, Sustainalytics or MSCI. This ESG data is needed by multiple teams, including product, reporting and client services. With live data sharing, each team can access the same governed dataset without creating their own version.

    This also applies to external parties. With secure access controls, you can share specific datasets with partners or clients in real time without emailing files or exposing unnecessary information (a sketch of this kind of sharing follows below).
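
The sketch below shows how this kind of governed sharing might be set up, using Snowflake role grants and secure shares as one example. The database, schema, role, share and account names are all hypothetical, and other warehouse platforms offer equivalent controls.

```python
# Illustrative only: live sharing in Snowflake. Internal teams get read
# access through roles; an external consultant account gets a secure share.
# Database, role, share and account names are made up.
import snowflake.connector

conn = snowflake.connector.connect(account="xy12345", user="ADMIN", password="********")
cur = conn.cursor()

statements = [
    # Internal teams: everyone reads the same governed ESG dataset via roles.
    "GRANT USAGE ON DATABASE esg_db TO ROLE reporting_team",
    "GRANT USAGE ON SCHEMA esg_db.scores TO ROLE reporting_team",
    "GRANT SELECT ON ALL TABLES IN SCHEMA esg_db.scores TO ROLE reporting_team",

    # External party: a secure share exposes the dataset without copying it.
    "CREATE SHARE IF NOT EXISTS esg_share",
    "GRANT USAGE ON DATABASE esg_db TO SHARE esg_share",
    "GRANT USAGE ON SCHEMA esg_db.scores TO SHARE esg_share",
    "GRANT SELECT ON TABLE esg_db.scores.company_ratings TO SHARE esg_share",
    "ALTER SHARE esg_share ADD ACCOUNTS = partner_account",  # consultant's account
]

for stmt in statements:
    cur.execute(stmt)

cur.close()
conn.close()
```

The consuming teams and the external account all query the same data in place; nothing is copied, exported or emailed.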

What to keep in mind before making the switch

Moving to a cloud data warehouse brings huge benefits, but it isn’t a silver bullet. Here are some important considerations:
  1. Know what it is, and what it’s not: Cloud data warehouses are not enterprise data management (EDM) systems. They won’t clean or enrich your data, and they don’t replace an ABOR (Accounting Book of Record) or PBOR (Performance Book of Record). What they do provide is a more efficient way to connect systems and eliminate clunky, manual data movement. If your foundation isn’t solid, you’ll need to address that first.
  2. Check vendor compatibility: Before investing, check whether your downstream systems, including reporting tools, portals and other platforms, can connect to the warehouse or support live data sharing. If they can’t, it may be time to rethink your vendor relationships.
  3. Understand the cost-performance balance: Cloud data warehouses charge based on compute power, not just storage. Faster queries cost more. If your reports require high performance, you’ll need to budget accordingly. Most firms can manage costs with clear expectations and planning.
  4. Use sleep mode to save money: These platforms often support auto-suspend or sleep mode features. Because reporting workloads tend to run in short bursts, you can avoid paying for idle time. It’s a simple way to keep costs down without sacrificing performance (a configuration sketch follows this list).
  5. Get security right from the start: Data sharing doesn’t mean less control. In many ways, it gives you more. You can monitor access, set granular permissions and revoke access at any time. But you need strong governance, and you should apply the principle of least privilege: only give users access to the data they absolutely need (see the second sketch below).
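
On point 4, the snippet below shows one way the sleep mode idea can be configured, using Snowflake’s warehouse auto-suspend and auto-resume settings as an example. The warehouse name and the 60-second threshold are illustrative.

```python
# One way to put "sleep mode" into practice: a Snowflake virtual warehouse
# can suspend itself after a period of inactivity and resume automatically
# when a query arrives. Names and values here are illustrative.
import snowflake.connector

conn = snowflake.connector.connect(account="xy12345", user="ADMIN", password="********")
cur = conn.cursor()

# Suspend the reporting warehouse after 60 seconds of inactivity and let it
# wake on demand, so short reporting bursts don't pay for idle compute.
cur.execute("ALTER WAREHOUSE reporting_wh SET AUTO_SUSPEND = 60 AUTO_RESUME = TRUE")

cur.close()
conn.close()
```

When no queries arrive, the warehouse suspends itself and compute charges stop; the next report simply wakes it up.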
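
On point 5, here is a minimal least-privilege sketch: a read-only role for a client dashboard that can see exactly one reporting view and nothing else, again using Snowflake syntax with hypothetical object names.

```python
# Least-privilege sketch (hypothetical names): a client dashboard role that
# can read exactly one reporting view, and can be cut off instantly.
import snowflake.connector

# Placeholder administrator with rights to manage roles and grants.
conn = snowflake.connector.connect(account="xy12345", user="ADMIN", password="********")
cur = conn.cursor()

grants = [
    "CREATE ROLE IF NOT EXISTS client_dashboard_ro",
    "GRANT USAGE ON DATABASE reporting_db TO ROLE client_dashboard_ro",
    "GRANT USAGE ON SCHEMA reporting_db.client_views TO ROLE client_dashboard_ro",
    # Only the single view the dashboard needs -- nothing broader.
    "GRANT SELECT ON VIEW reporting_db.client_views.performance_summary TO ROLE client_dashboard_ro",
]
for stmt in grants:
    cur.execute(stmt)

# Access can be withdrawn at any time without touching the underlying data:
# cur.execute("REVOKE SELECT ON VIEW reporting_db.client_views.performance_summary FROM ROLE client_dashboard_ro")

cur.close()
conn.close()
```

Revoking that single grant cuts off the dashboard immediately, without moving or deleting any data.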

Why this matters

The shift to modern data infrastructure isn’t just about better reporting. It’s about reducing operational risk, streamlining processes and building a more flexible foundation for growth.

Many forward-thinking managers are already offering clients live access to reporting dashboards powered directly by these warehouses. Others are starting to support real-time data feeds into investor portals and consultant platforms.

It’s not just about doing things faster. It’s about doing them smarter, more securely and at scale. A great return on your investment.

In case you missed it, part 1 of this series, Tackling data chaos when scaling institutional reporting, looks at where many of these infrastructure challenges begin.

More money, more problems: Part 1

FAQs:

  1. What’s the issue with the traditional way data is managed for reporting? Traditional setups often involve copying and moving data between systems. This makes things slow, messy and more likely to go wrong. It’s hard to keep everything up to date, and even harder to fix when something breaks.
  2. How do firms usually move data from source systems into reporting tools? Most use file transfers or scheduled API connections. These methods work, but they create lots of data copies and make it difficult to manage changes. A small update in one place might not get picked up elsewhere, which leads to errors.
  3. How do cloud data warehouses make reporting better? They let you access your data in one place, in real time. That means no more daily exports or manual uploads. You can build reports that are always up to date, without the risk of version issues or delays.
  4. What is zero-copy data sharing? Zero-copy means your tools can access data directly from the source, without making a copy first. This keeps everything consistent and reduces errors in reports or dashboards.
  5. Can different teams use the same data without problems? Yes. With the right access controls, everyone can use the same data source. So your ESG, product, reporting and client teams are always working with the latest, most accurate information.
  6. Is it safe to share data with clients or external partners? Yes. You can control exactly who sees what, down to a specific dataset. There’s no need to email files or share folders, and access is faster and more secure.
  7. Does this replace our existing data systems? No. A cloud data warehouse doesn’t clean your data or replace your ABOR or PBOR. It simply helps you move data more efficiently and connect your systems in a smarter way.
  8. What should we check before making the switch? Make sure your other systems, like reporting platforms and portals, can connect to a data warehouse. If they can’t, you may need to review your current setup or speak to your vendors.
  9. Will this cost more to run? You only pay for what you use. If you plan your usage and take advantage of features like auto-suspend, you can keep costs low while still getting the performance you need.
  10. How does modern infrastructure help reduce risk? It removes the need for manual processes and messy data transfers. That means fewer errors, better control over your data and a stronger foundation for growth.