You’re sitting in a boardroom, discussing how your organization can rapidly create new value from existing assets — people, markets, partners, pricing models, products, and even go-to-market strategies. It is a common dilemma — and an opportunity.
But there is one asset often overlooked – your data. An estimated 90% of enterprise data is unstructured. In fact, the majority of that data is likely locked away in tens of thousands of databases, perhaps scattered across multiple geographies, domains, and owners. There is a vast reservoir of untapped, unseen, and underutilized data waiting to be unleashed.
The power of that unstructured and often unseen data is immense. The financial services sector in the UK (12% of UK GDP) probably has the most value trapped in its data systems. Such data is rightly sovereign to the organizations, consumers, and customers it describes. But applying AI and machine learning algorithms to those datasets could unlock additional GDP growth of 1%–5% a year globally by 2030.
There is a formula for not only understanding your data but harnessing its value for real output and AI applications. Doing this well involves total estate observability, agile data infrastructure, a focus on building continuous value moments (versus one-and-done), and a decision to think about data as an intelligent platform for success.
Every day, a new wave of AI technologies promises to solve our old and new data challenges. However, these technologies vary widely in maturity and generally present a steep learning curve for the traditional data team. That said, a unique opportunity exists to leverage proven, reliable, and adaptable technologies and vendors.
PostgreSQL, or Postgres, is perhaps the best technical example of this. An enduring relational database that has reemerged in popularity as the most loved and used database, it is more powerful and relevant in the AI age than ever before. With its robust architecture, native support for multiple data types, and extensibility by design, Postgres is a prime candidate for enterprises looking to harness the value of their data.
It is no surprise that our research shows that over 35% of enterprises are considering Postgres for their next data platform or infrastructure project, especially around AI applications and data.
Our experience with customers in highly demanding and complex sectors shows that unlocking data value comes down to four key areas of data sovereignty, each of which plays to the strengths of enterprise-grade Postgres.
To successfully navigate these challenges, organizations should focus on these four key principles:
When 90% of your data is scattered across thousands of databases, geographies, domains, and owners—spanning multiple generations of technologies—the ability to centralize oversight and coordination becomes essential to unlock untapped value and maintain systems that are both healthy and high-performing. While data lakes or estates might connect conceptually through topology or marketecture, they often lack the technical bridges needed to extract or infuse new value.
Imagine holding 10 different currencies, each only valid in its own country. Without a way to exchange them, their potential remains locked in their respective geographies. For data, centralized management and observability act as that exchange system, creating a common denominator that enables you to discover and deliver entirely new value structures.
McKinsey tells us that data-driven organizations are 23 times more likely to acquire customers, 6 times more likely to retain customers, and 19 times more likely to be profitable. Senior executives in charge of mission-critical workloads (OLTP, OLAP, and AI) told us that the ability to infuse AI into these workloads to reveal or create value is enhanced by more than 20% with full estate observability. It is a fundamental need.
Enterprise-grade Postgres, as a centralized observability platform, solves for this in three ways:
Postgres provides a unified data management layer that seamlessly integrates across on-premises and cloud environments, enabling centralized observability and management.
With various built-in extensions, Postgres delivers real-time monitoring and performance insights that are crucial for optimizing AI workloads.
Postgres’s versatility in handling both structured and unstructured data ensures that every data source is included in observability efforts, regardless of where it resides.
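As a minimal sketch of what that built-in observability looks like in practice, the pg_stat_statements extension (shipped with Postgres, though it must be preloaded in postgresql.conf) exposes per-query performance statistics that a centralized monitoring layer can aggregate across the estate. The column names below assume Postgres 13 or later:

```sql
-- Requires shared_preload_libraries = 'pg_stat_statements'
-- in postgresql.conf, then:
CREATE EXTENSION IF NOT EXISTS pg_stat_statements;

-- Surface the ten slowest statements on this instance, a building
-- block for estate-wide performance observability.
SELECT query, calls, mean_exec_time, total_exec_time
FROM pg_stat_statements
ORDER BY mean_exec_time DESC
LIMIT 10;
```

Run against each instance in the estate, results like these can be rolled up into a single dashboard, giving the "common denominator" view described above.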
Consumers drive two out of every three dollars in almost every economy worldwide. Most of us have a clear understanding of our bank account balances, the status of our recent online purchases, our streaming habits, and our healthcare or mortgage payments. We know this personal data is, well, personal, but it also needs to be generally accessible anytime, anywhere to ensure a seamless experience. And it should always be secure. This same principle should apply to business data.
Yet much of a business’s data potential is trapped—locked in various clouds, scattered across regions, and sometimes still on-premises. Freeing this sovereign data so that it’s secure yet available at all times should feel as seamless as logging into your Netflix account to view your watch history or checking your physical wallet for a receipt.
Your data isn’t confined to one format or location, accessible only through a single interface. It exists on-premises (the physical wallet) and across multiple clouds (as with Netflix or a global bank), and it spans geographies where it must perform reliably.
Enterprise-grade Postgres enables you to deliver that sovereign power in three ways:
It offers full flexibility in deployment across multiple environments, from on-premises to private and public clouds.
Its advanced security features, such as role-based access control, encrypted connections, and built-in auditing, ensure that your data remains compliant with industry and regional regulations.
Postgres also enables seamless data replication and synchronization, making data available across geographies while maintaining security and control.
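A brief sketch of two of these capabilities using stock Postgres features. The role, database, table, and host names here are illustrative, not from any real deployment:

```sql
-- Role-based access control: a read-only analyst role
-- (names are hypothetical).
CREATE ROLE analyst NOLOGIN;
GRANT CONNECT ON DATABASE appdb TO analyst;
GRANT USAGE ON SCHEMA public TO analyst;
GRANT SELECT ON ALL TABLES IN SCHEMA public TO analyst;

-- Logical replication across regions: publish a table from one
-- instance...
CREATE PUBLICATION eu_orders FOR TABLE orders;

-- ...and subscribe to it from another (run on the remote instance).
CREATE SUBSCRIPTION us_copy
  CONNECTION 'host=eu-db.example.com dbname=appdb user=replicator'
  PUBLICATION eu_orders;
```

Logical replication of this kind is one mechanism for keeping data available across geographies while the publishing instance retains control over exactly which tables leave its region.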
The power of dynamic pricing, personalized customer support, and real-time supply chain management lies in their immediacy. A hotel front desk manager can adjust room pricing in real time — and now imagine every manager at 1,000+ locations doing the same, instantly responding to demand.
This level of real-time responsiveness creates significant value for the business, turning a room that would lose all value after 11 pm into an opportunity for last-minute revenue. It also enhances the customer experience by enabling the front desk clerk to accommodate personal preferences or upsell, based on near-instant understanding of customer data.
The idea of personalization holds enormous potential. McKinsey estimates a $1 trillion opportunity, while Deloitte’s research shows that consumers are willing to pay 20% more for personalized services. In industries such as hospitality, dynamic, real-time value creation — powered by AI and data — may be a key path to profitability growth.
Now consider the value of this real-time access to data in banking, where portfolio managers make real-time decisions for individual clients; and in healthcare, where nurse practitioners and doctors can access up-to-the-minute patient data. Organizations across countless industries will benefit from empowering frontline workers to make informed, real-time decisions.
To enable this, data must flow seamlessly across domains, and real-time models need to interact with real-time updates as they are generated.
Value creation also includes the ability to detect and mitigate risks swiftly. This requires strong data governance: proper authorization controls, robust security layers, and advanced encryption protocols must be in place to ensure that risks are mitigated in real time while keeping the system secure.
Enterprise-grade Postgres enables real-time data value creation:
With its high-performance architecture, Postgres supports complex transactions and near–real-time analytics.
Its native support for multi-modal workloads and its extensibility by design extend it well beyond traditional operational workloads.
Integration of AI models directly with the database enables real-time personalization and decision-making at scale.
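One way this integration is commonly realized is vector similarity search inside the database itself. The sketch below assumes the widely used open-source pgvector extension is installed; the table, column, and three-dimensional embeddings are purely illustrative (real embeddings from an external model would typically have hundreds of dimensions):

```sql
-- Assumes the pgvector extension is available on the server.
CREATE EXTENSION IF NOT EXISTS vector;

-- Hypothetical table of guest embeddings produced by an external model.
CREATE TABLE guest_profiles (
  guest_id  bigint PRIMARY KEY,
  embedding vector(3)  -- toy dimension for illustration
);

-- Nearest-neighbor lookup: the five guests most similar to the
-- current one, to drive a real-time upsell or personalization choice.
SELECT guest_id
FROM guest_profiles
ORDER BY embedding <-> '[0.1, 0.2, 0.3]'::vector
LIMIT 5;
```

Because the lookup runs where the data lives, the front-desk scenario above can be served in a single round trip rather than exporting data to a separate AI system.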
Sixty percent of large enterprises in the U.S. and major EMEA economies are infusing their own AI into enormous data workloads. A significant portion of the future value of business data will come from the ability to access it anytime, anywhere, while maintaining compliance.
And AI’s power should only be constrained by industry standards and regional regulations, never by the limitations of your data infrastructure. This means your data requires a new governance framework and will need to exist across multiple environments — on-premises, in private clouds, and potentially in public clouds — ensuring both flexibility and compliance at every level.
Enterprise-grade Postgres serves as a foundational data platform:
Its ability to handle massive datasets across distributed environments ensures that data remains accessible, compliant, and ready for AI integration.
Postgres optimizes performance for large-scale data operations, ensuring that organizations can harness AI for real-time competitive advantage.
Data and AI are at the heart of economic power and personalization, as seen in organizations including Amazon, Tesla, Netflix, and leading banks. Building your own sovereign AI and data platform should be an essential strategy that drives the modernization of both your data infrastructure and your business practices.
This is that rare moment when business needs, infrastructure needs, and the raw power of AI and data should converge into one secure, compliant, and near-instantly productive platform for transformational success.
About the Author:
Jozef de Vries is Chief Product Engineering Officer at EDB, where he leads product development across the on-premises and cloud portfolios. Prior to joining EDB, he spent nearly 15 years at IBM in a number of roles, most recently building and leading the IBM Cloud Database development organization through organic growth, mergers, and acquisitions.