Tag Archives: big data

Making big data work for economics

Arthur Turrell, Bradley Speigner, James Thurgood, Jyldyz Djumalieva, and David Copple

‘Big Data’ present big opportunities for understanding the economy. They can be cheaper and more detailed than traditional data sources, and available on scales undreamt of by survey designers. But they can be challenging to use because they rarely adhere to the nice, neat classifications used in surveys. We faced just this challenge when trying to understand the relationship between the efficiency with which job vacancies are filled and output and productivity growth in the UK. In this post, we describe how we analysed text from 15 million job adverts to glean insights into the UK labour market.
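
To give a flavour of the kind of text processing involved (a rough sketch only, not the method used in the research), the snippet below matches free-text job titles to the nearest standard occupational category using TF-IDF vectors and cosine similarity. The categories and adverts are invented for illustration.

```python
# Minimal sketch: matching free-text job adverts to standard occupational
# categories via TF-IDF similarity. Illustrative only: the categories and
# job titles below are invented, and the research may use a different method.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# A handful of hypothetical survey-style occupational categories.
categories = [
    "software developer programmer",
    "registered nurse healthcare",
    "chef cook kitchen staff",
]

# Free-text job titles as they might appear in online adverts.
adverts = [
    "Senior Python Programmer (fintech)",
    "Night-shift Staff Nurse, NHS ward",
    "Sous chef wanted for busy kitchen",
]

# Fit the vectoriser on both sets so they share a vocabulary.
vectoriser = TfidfVectorizer().fit(categories + adverts)
cat_vecs = vectoriser.transform(categories)
ad_vecs = vectoriser.transform(adverts)

# Assign each advert to the most similar category.
similarity = cosine_similarity(ad_vecs, cat_vecs)
for advert, scores in zip(adverts, similarity):
    best = scores.argmax()
    print(f"{advert!r} -> {categories[best]!r} (score {scores[best]:.2f})")
```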

Filed under Macroeconomics, New Methodologies

Using machine learning to understand the mix of jobs in the economy in real-time

Arthur Turrell, Bradley Speigner, James Thurgood, Jyldyz Djumalieva and David Copple

Recently, economists have been discussing, on the one hand, how artificial intelligence (AI) powered by machine learning might increase unemployment, and, on the other, how AI might create new jobs. Either way, the future of work is set to change. We show in recent research how unsupervised machine learning, driven by data, can capture changes in the type of work demanded.
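
As a purely illustrative sketch of what unsupervised learning on job data can look like in practice (the post describes the actual approach), the snippet below clusters a handful of invented job descriptions into broad types of work using TF-IDF features and k-means.

```python
# Minimal sketch: grouping job descriptions into "types of work" with
# unsupervised learning (k-means on TF-IDF features). Purely illustrative:
# the texts are invented and the research may use a different algorithm.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

descriptions = [
    "develop and maintain web applications in Python",
    "care for patients on a hospital ward",
    "prepare meals in a fast-paced restaurant kitchen",
    "build data pipelines and machine learning models",
    "assist patients with daily living and medication",
    "run the pastry section of a busy kitchen",
]

features = TfidfVectorizer(stop_words="english").fit_transform(descriptions)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(features)

for text, label in zip(descriptions, labels):
    print(label, text)
```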

Filed under Macroeconomics, New Methodologies

Big Data jigsaws for Central Banks – the impact of the Swiss franc de-pegging

Olga Cielinska, Andreas Joseph, Ujwal Shreyas, John Tanner and Michalis Vasios

The Bank of England now has access to transaction-level data on over-the-counter derivatives (OTCD) markets, which were identified as lying at the centre of the 2007-2009 Global Financial Crisis (GFC). With tens of millions of daily transactions, these data catapult central banks and regulators into the realm of big data. In our recent Financial Stability Paper, we investigate the impact of the de-pegging of the euro-Swiss franc (EURCHF) market by the Swiss National Bank (SNB) on the morning of 15 January 2015. We reconstruct detailed trading and exposure networks between counterparties and show how these can be used to understand unprecedented intraday price movements, changing liquidity conditions and increased levels of market fragmentation over a longer period.
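
A stylised sketch of the network-reconstruction step might look like the following, using the networkx library; the counterparties, notional amounts and the fragmentation measure shown are assumptions for illustration rather than anything taken from the paper.

```python
# Minimal sketch: building a trading network from transaction-level records
# and summarising its structure. Counterparty names, notionals and the chosen
# fragmentation measure are invented for illustration.
import networkx as nx

# Each record: (seller, buyer, notional traded) -- hypothetical data.
transactions = [
    ("Bank A", "Dealer X", 50.0),
    ("Bank A", "Dealer Y", 20.0),
    ("Fund B", "Dealer X", 35.0),
    ("Dealer X", "Dealer Y", 80.0),
]

g = nx.DiGraph()
for seller, buyer, notional in transactions:
    # Aggregate gross notional on each directed edge.
    if g.has_edge(seller, buyer):
        g[seller][buyer]["notional"] += notional
    else:
        g.add_edge(seller, buyer, notional=notional)

print("counterparties:", g.number_of_nodes())
print("trading relationships:", g.number_of_edges())
# One simple (of many possible) fragmentation indicators: the number of
# weakly connected components in the trading network.
print("components:", nx.number_weakly_connected_components(g))
```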

Filed under Currency, Financial Stability, Market Infrastructure, New Methodologies

The U Word: What can text analytics tell us about how far uncertainty has risen during 2016?

Alastair Cunningham, David Bradnum and Alastair Firrell.

Uncertainty is a hot topic for economists at the moment. Have business leaders become more uncertain as a result of the EU referendum? If so, has that uncertainty had any effect on their plans? The Bank’s analysts look at lots of measures of economic uncertainty, from complex financial market metrics to how often newspaper articles mention it. But few of those measures are sourced directly from the trading businesses up and down the country whose investment and employment plans affect the UK economy. This blog reports on recent efforts to draw out what the Bank’s wide network of business contacts are telling us about uncertainty – comparing what we’re hearing now to trends seen in recent years.
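
One very simple flavour of this kind of text analytics is a keyword count over reports; the sketch below is a toy version with invented report snippets and an invented word list, not the approach the team actually used.

```python
# Minimal sketch: a crude "uncertainty" indicator from free-text reports,
# counting uncertainty-related words per report. The snippets and word list
# are invented; the Bank's actual approach may differ substantially.
import re

UNCERTAINTY_TERMS = {"uncertain", "uncertainty", "unclear", "unpredictable"}

reports = [
    "Investment plans on hold given uncertainty around the referendum result.",
    "Hiring continues as planned; demand outlook unchanged.",
    "Outlook unclear and unpredictable; capital spending deferred.",
]

def uncertainty_score(text: str) -> int:
    """Count occurrences of uncertainty-related terms in a report."""
    words = re.findall(r"[a-z]+", text.lower())
    return sum(word in UNCERTAINTY_TERMS for word in words)

for report in reports:
    print(uncertainty_score(report), report)
```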

Filed under Macroeconomics, New Methodologies

Tweets, Runs and the Minnesota Vikings

David Bradnum, Christopher Lovell, Pedro Santos and Nick Vaughan.

Could Twitter help predict a bank run? That was the question a group of us were tasked with answering in the run-up to the Scottish independence referendum. To investigate, we built an experimental system in just a few days to collect and analyse tweets in real time. In the end, fears of a bank run were not realised, so the jury is still out on Twitter. But even so, we learnt a lot about social media analysis (and a little about American Football) and argue that text analytics more generally has much potential for central banks.
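
To illustrate just the filtering step (the actual system, including the Twitter data collection, was more involved), here is a toy sketch that scans a stream of tweet texts for bank-run-related phrases and tallies mentions per minute; the tweets, keywords and timestamps are invented.

```python
# Minimal sketch: scanning a stream of tweet texts for bank-run-related
# phrases and counting matches per minute. The tweets, keywords and timestamps
# are invented; the real system's data collection and analysis were richer.
from collections import Counter
from datetime import datetime

KEYWORDS = ("bank run", "withdraw my savings", "queue outside the branch")

# Hypothetical (timestamp, text) pairs standing in for a live tweet stream.
stream = [
    (datetime(2014, 9, 17, 9, 1), "Long queue outside the branch this morning"),
    (datetime(2014, 9, 17, 9, 1), "Lovely weather in Edinburgh today"),
    (datetime(2014, 9, 17, 9, 2), "Thinking I should withdraw my savings just in case"),
]

mentions_per_minute = Counter()
for timestamp, text in stream:
    if any(keyword in text.lower() for keyword in KEYWORDS):
        mentions_per_minute[timestamp.replace(second=0, microsecond=0)] += 1

for minute, count in sorted(mentions_per_minute.items()):
    print(minute.isoformat(), count)
```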

Filed under Financial Stability, New Methodologies

It’s a model – but is it looking good? When banks’ internal models may be more style than substance.

Tobias Neumann.

Most large banks assess the capital they need for regulatory purposes using ‘internal models’.  The idea is that banks are in a better position to judge the risks on their own balance sheets.  But there are two fundamental problems that can arise when it comes to modelling.  The first is complexity.  We live in a complex world, but does that mean a complex model is always the best way of dealing with it? Probably not. The second problem is a lack of ‘events’ (eg defaults).  If we cannot observe an event, it is difficult to model it credibly, so internal models may not work well.
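
The lack-of-events point can be made concrete with a back-of-the-envelope calculation: with very few observed defaults, the estimated default probability carries enormous relative uncertainty. The sketch below uses a standard exact (Clopper-Pearson) binomial interval with invented portfolio numbers.

```python
# Minimal sketch: why few observed defaults make default probabilities hard to
# pin down. Portfolio size and default count are invented; the interval is a
# standard exact (Clopper-Pearson) binomial confidence interval.
from scipy.stats import beta

def clopper_pearson(defaults: int, exposures: int, level: float = 0.95):
    """Exact two-sided confidence interval for a binomial proportion."""
    alpha = 1.0 - level
    lower = 0.0 if defaults == 0 else beta.ppf(alpha / 2, defaults, exposures - defaults + 1)
    upper = 1.0 if defaults == exposures else beta.ppf(1 - alpha / 2, defaults + 1, exposures - defaults)
    return lower, upper

# A low-default portfolio: 2 defaults out of 1,000 exposures.
lower, upper = clopper_pearson(defaults=2, exposures=1000)
print(f"point estimate: {2 / 1000:.4f}")
print(f"95% interval:   [{lower:.4f}, {upper:.4f}]")  # roughly 0.0002 to 0.0072
```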

Filed under Banking, Financial Stability, Macroprudential Regulation, Microprudential Regulation