The implications of Ben Bernanke’s review of forecasting for monetary policy making at the Bank of England go far beyond the Bank itself. His recommendations lead with a call to modernise the software used to manipulate data as rapidly as feasible. The Bank should certainly expand its investment in the flexible, scalable and dynamic data platforms that are the foundation of effective forecasting. But the same is true for many other organisations across financial services and beyond, not only in the UK but around the world. As we enter the age of Artificial Intelligence, when organisations of all types will look to deploy AI across their operations, the ability to provide robust, accurate, auditable data at speed and scale, to models and humans, whenever and wherever it is required, will be the defining characteristic of success.
The first of the three themes outlined in the Bernanke report is “Building and Maintaining a high-quality infrastructure for forecasting and analysis.” It specifically makes the point that: “Substantial investment is needed, beyond that already underway, to develop key parts of the data, modelling, forecasting and evaluation infrastructure and the expert staff to support them.” According to Bernanke, “[…] deficiencies in the framework, together with a variety of makeshift fixes over the years, have resulted in a complicated and unwieldy system that limits the capacity of the staff to undertake some useful analyses.”
Four Urgent Outcomes
Although the Bank of England, along with many other organisations, is making efforts to upgrade the data infrastructure upon which it relies, as the report makes clear, these efforts are not getting the focus and urgency they require. As Teradata counsels all of its customers, building robust and scalable analytics and data platforms that empower everyone to access unified, harmonised and trusted data is the essential first step in creating data-empowered organisations. Investing in hybrid multi-cloud data and analytics platforms can quickly enable the four outcomes Bernanke identifies as necessary capabilities:
- Data is comprehensive and clearly defined, with sources provided; updated in a timely way; and easily searchable.
- Data can be exported and transformed to meet the requirements of routine analyses, with adequate source control (governance).
- Large data sets, both time series and cross-sectional, can be ‘cleaned’ for efficient use in analysis and research.
- Data input to economic and statistical models is automated as much as possible.
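The last two outcomes are concrete enough to sketch. The fragment below is illustrative only: the Bank’s actual tooling is not public, so pandas and the `clean_time_series` helper are assumptions chosen purely to show what an automated cleaning pass over a time-series data set might look like (deduplicate, sort, fill gaps, flag outliers) before the data is fed into a model.

```python
import pandas as pd

def clean_time_series(df: pd.DataFrame, value_col: str = "value") -> pd.DataFrame:
    """Illustrative automated cleaning pass: deduplicate by date,
    sort chronologically, fill gaps, and flag extreme values."""
    out = df.drop_duplicates(subset="date").sort_values("date").reset_index(drop=True)
    # Forward-fill missing observations (a common, if simplistic, choice)
    out[value_col] = out[value_col].ffill()
    # Flag values more than three standard deviations from the mean
    mean, std = out[value_col].mean(), out[value_col].std()
    out["outlier"] = (out[value_col] - mean).abs() > 3 * std
    return out

# A tiny raw extract with a duplicate date and a missing observation
raw = pd.DataFrame({
    "date": pd.to_datetime(["2024-01-01", "2024-01-01", "2024-01-02", "2024-01-03"]),
    "value": [1.0, 1.0, None, 2.0],
})
clean = clean_time_series(raw)
```

The point is not the specific rules, which any real pipeline would replace with richer validation, but that such steps are scripted and repeatable rather than performed by hand in spreadsheets.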
A more succinct description of the capabilities of Teradata VantageCloud is hard to envisage. What is implicit in these outcomes, and of significant value, is the need to create data resources that are sharable and can be augmented to meet the requirements of other Bank activities such as regulation and its overall responsibility for managing systemic stability. In short, building the right foundations for better forecasting also establishes a robust framework for data analytics across all the Bank’s functions.
Embed Data Scientists
Interestingly, the report also recommends that “The Bank might consider adding a few data specialists to work with economists in accessing and working with data, especially larger and more complex data sets.” Aside from the alarming implication that the Bank of England does not already have data scientists or data engineers working alongside economists, this comment applies broadly across many organisations. The importance of developing data-driven cultures supported by a common language of data literacy is something that we have espoused for some time. Data analytics, and AI in particular, have become the source not only of critical insight, but of competitive advantage and growth. An ongoing, three-way dialogue between business users (be they economists, planners or line-of-business managers), IT teams and data science specialists is essential to ensure close alignment between requirements, capabilities, data and IT resources. With all three collaborating, data analytics and AI will be deployed to add value much more rapidly.
Uncharted Futures Need Trusted Data
The Bernanke report comes at an important time. Initiated to investigate how the Bank of England’s forecasting could be improved, it followed a period of extreme volatility that challenged many of the established models and approaches the Bank relied upon. The Covid, energy and cost-of-living crises were the immediate catalysts, but the real issue is that such volatility is now part of everyday experience across sectors. It is not only that a series of so-called ‘Black Swan’ events have happened in close succession, but that they are increasing in frequency. As Margaret Heffernan argued in her 2020 book Uncharted, planning for an uncertain future has become increasingly difficult. The ability to readily access trusted data from as wide a range of reliable sources as possible, to feed the thousands, if not millions, of analytical models that provide insight on potential scenarios, will be essential. As AI becomes mainstream, knowing exactly what data a model was trained on, and on which it makes decisions, will be critical if those decisions are to be trusted.
The Time is Now
The pressure is on at the Bank of England to move forward quickly and expand investment in the requisite data infrastructure to improve its forecasting. The same is true for many other organisations with similar ‘complicated and unwieldy’ systems, data silos, legacy technical debt and outdated processes that inhibit the agility and scalability required to thrive in today’s data-intensive environment. Getting your data house in order is the essential first step, and the time to do it is now.