James Cloyne, Ryland Thomas and Alex Tuckett.
The financial crisis has thrown up a huge number of empirical challenges for academic and professional economists. The search is on for a framework with a rich enough variety of financial and real variables to examine both the financial shocks that caused the Great Recession and the unconventional policies, such as Quantitative Easing (QE), that were designed to combat it. In a new paper we show how an older structural econometric modelling approach can provide insights into these questions in ways other models currently cannot. So what are the advantages of going back to an older tradition of modelling?
An ongoing issue for central bank economists is that they typically want to look at a wide range of financial sector variables, and at a more granular, sector-based level of aggregation than is typically found in macroeconomic models with credit and asset market frictions. For example, we often want to distinguish the credit provided to firms from that provided to households, or secured lending from unsecured lending. We may also want to compare and contrast a number of policy instruments that work through different channels, such as central bank asset purchases (QE) and macroprudential tools such as countercyclical capital requirements.
It is a tough challenge to incorporate all of these effects in the theoretical and empirical models typically used by macroeconomists, such as structural vector autoregression (SVAR) models and micro-founded general equilibrium (DSGE) models. For these reasons, turning back to the older tradition of building structural econometric models (SEMs) – built from blocks of simultaneously estimated equations with structural identifying restrictions – can be useful. This approach can be thought of as a blend of the more theory-free VAR methods and a more structural model-based approach. The main advantage of structural econometric frameworks is that they produce quantitative results at a sector level, which can still be aggregated up to produce a general equilibrium response. They also allow models to be built up in a modular way, so that sets of equations for particular blocks can be replaced and improved without necessarily undermining the logic of the model as a whole. This older-school approach to modelling has begun to appear in a variety of modern vintages. The GVAR methodology (see Mauro and Pesaran (2013)) also takes the approach of estimating many separate blocks, with linkages between them determined by a weighting matrix. This approach has been used to analyse global shocks (for example Cesa-Bianchi et al (2012)), with the weighting matrix built from trade relationships.
The gains in tractability, richness and ability to fit a broader set of data are undoubtedly useful for policy analysis. But this comes at the expense of being more subject to the Lucas critique and of needing stronger identifying assumptions than are typical in the SVAR and DSGE literatures. In particular, switching alternative blocks of equations in and out runs the risk of breaking cross-equation restrictions implied by the deeper structural parameters governing the behaviour of households, companies and policymakers. So the model loses some of the internal coherence of a DSGE model. This trade-off has been the subject of much discussion in the modelling community, given that the financial crisis has increased the need to incorporate a number of new features into macroeconomic models. As Simon Wren-Lewis has argued, a model that does not satisfy the Lucas critique can give better (albeit not perfectly robust) policy advice, because it is closer to the data.
So what does our model look like and what sort of things can we use it for?
There are two aspects at the core of our modelling strategy. Firstly, lending, money and spending decisions by each sector are modelled jointly, with interest rates and other variables treated as being determined externally to the sector. Secondly, the decisions of these sectors are aggregated up, using the GDP expenditure identity in the case of spending, and the consolidated aggregate balance sheet of UK-resident banks in the case of the lending and money variables (Figure 1).
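As a stylised illustration of the second aspect – using made-up numbers, not the model's actual data – sectoral spending decisions sum to GDP through the expenditure identity, while sectoral loans and deposits must reconcile on the banks' consolidated balance sheet, with any gap financed by non-deposit liabilities:

```python
# Stylised illustration with hypothetical numbers (not the model's data):
# aggregating sectoral decisions through the two accounting identities.

# GDP expenditure identity: sectoral spending decisions sum to GDP.
sector_spending = {
    "household_consumption": 65.0,
    "housing_investment": 4.0,
    "business_investment": 10.0,
    "government_spending": 19.0,
    "net_trade": 2.0,
}
gdp = sum(sector_spending.values())

# Consolidated banking-sector balance sheet: loans to each sector (assets)
# must equal sectoral deposits plus non-deposit liabilities (funding).
loans = {"households_secured": 1100.0, "households_unsecured": 200.0,
         "pnfcs": 450.0, "niofcs": 250.0}
deposits = {"households": 1300.0, "pnfcs": 400.0, "niofcs": 150.0}

# The residual is the banks' non-deposit ('wholesale') funding need,
# i.e. the customer funding gap.
non_deposit_liabilities = sum(loans.values()) - sum(deposits.values())

print(gdp)                      # 100.0
print(non_deposit_liabilities)  # 150.0
```

The point of the sketch is simply that each sector can be modelled separately and the identities then enforce general equilibrium consistency when the pieces are added back together.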
The variables for each sector are jointly estimated as vector error correction models (VECMs). The household sector is in turn split into two sub-systems: one which determines households’ money, unsecured borrowing and consumption; and another which determines secured lending, house prices and housing investment. There are also sectoral VECMs for the banking sector and for non-intermediary other financial corporations (NIOFCs), which include pension funds and other asset managers. Other elements on the banking system’s balance sheet are currently assumed fixed (the greyed-out elements in Figure 1). But the modular nature of the model will allow us to incorporate these elements over time.
In our paper we show how the model can be used for two purposes:
First, we look at the impact of an unanticipated increase in asset purchases by the central bank (QE). We show how the model can be used to analyse the portfolio rebalancing channel of QE as it affects the money holdings of non-bank financial companies, and the subsequent effect that has on asset prices, aggregate demand and inflation (Chart 1). But we are also able to look at the indirect effects of QE on other parts of the banking system’s balance sheet, such as banks’ holdings of government debt and other liquid assets and their issuance of long-term debt and equity (non-deposit liabilities), shown in Chart 2. We also explore the channels through which QE might affect bank lending – through boosting the demand for credit by increasing activity, and through reducing banks’ funding costs – building on work by Nick Butt, Rohan Churm and Michael McMahon.
Chart 1: The impact of QE on GDP and inflation
Second, we use the model to look at the impact of the financial crisis on the economy. Here we examine the decline in spreads that occurred in the immediate lead-up to the crisis, followed by the much larger increase in credit spreads faced by the household and corporate sectors as the financial crisis unfolded. We show that the credit supply shock that followed the financial crisis had a peak impact on real GDP of around -5% – around a third of the decline in GDP relative to its pre-crisis trend (Chart 3). The shock reduced both private consumption and investment (Chart 4). Our model also allows us to estimate the effect that the financial crisis had on levels of lending in the economy. This suggests the contraction in credit supply after the crisis can explain around half of the deviation in credit volumes, and two thirds of the fall in the customer funding gap, relative to trend. Interestingly, the impact on lending is more persistent than the impact on GDP; lending volumes continue to stagnate after GDP has begun recovering.
The flexibility of our model also allows us to explore the implications of new and alternative theories and embody some of the latest empirical results emerging from microdata studies. For example:
- Our colleague Michael Kumhof, in joint work with Zoltan Jakab, has shown the importance of banks as creators of money and how shocks may be considerably amplified compared to models that assume banks are pure intermediaries. Our model embodies the principle that loans create deposits and contains the necessary linkages to explore the impact on asset prices and the real economy.
- In a recent paper and Bank Underground post, May Rostom, Jeremy Franklin and Greg Thwaites have highlighted the availability of bank credit to private non-financial corporations (PNFCs) as an important factor explaining investment and productivity during the crisis. Our model can easily accommodate the effect on labour productivity they find. This allows us to look at the potential effect of different credit supply shocks on potential supply.
This paper also discusses how the model could be used to assess other unconventional policy instruments: policies that are intended to reduce bank funding costs such as the Funding for Lending Scheme, regulatory liquidity policy, and some macroprudential tools.
All this suggests to us that there is still a place in the wine cellar for ‘vintage’ structural econometric models, and that they should be brought to the policymaker’s table on appropriate occasions. They allow new issues and questions to be explored empirically before those effects can be grounded more formally in microfounded models. This is important for policymakers, who often have to make decisions long before that process has been completed.
The idea (from Robert Lucas Jnr) that it is difficult to use models based on past data to predict the effects of policy: behaviour changes as a result of the policy, so past relationships are no longer a guide to the future.
Charts 1 and 2 show the estimated impact of the actual asset purchases – of £375bn in total – that took place between 2009 and 2013. The assumptions behind the impact on potential supply shown in Chart 1 are discussed in more detail in our Staff Working Paper.
The customer funding gap (CFG) is the difference between loans extended from the domestic banking sector to the real economy, and deposits held by the real economy with the domestic banking sector. By definition, the CFG is the amount of lending that must be financed with ‘wholesale’ funding – from domestic non-bank financial institutions or from overseas.
For more detail on the Funding for Lending Scheme, see Churm et al (2012).
Bank Underground is a blog for Bank of England staff to share views that challenge – or support – prevailing policy orthodoxies. The views expressed here are those of the authors, and are not necessarily those of the Bank of England, or its policy committees.