Financial Risk & Network Theory

In collaboration with Journal of Network Theory in Finance

Financial institutions and markets are highly interconnected, but a literature mapping these interconnections and assessing their impact on financial risks and returns has only recently begun to emerge. The conference inaugurates the new Journal of Network Theory in Finance and aims to spark interest and foster new collaborations in this emerging multidisciplinary field.

About the “Journal of Network Theory in Finance”

The Journal of Network Theory in Finance (JNTF) is an interdisciplinary journal publishing academically rigorous, practitioner-focused research on the application of network theory in finance and related fields. It brings together research carried out in disparate areas of academia and other research institutions with work by policymakers and industry practitioners.

Download the Journal of Network Theory in Finance, 1 (1) (pdf, 5.4MB)

Visit the Journal of Network Theory in Finance website 

Contact Giulia Modeo at Incisive Media with enquiries

Research areas

Topics include but are not limited to the following:

  • Empirical network analysis that enables better understanding of financial flows, exposures and markets
  • Modelling and simulation techniques for measuring interdependent financial risks
  • New metrics and techniques for identifying central, vulnerable or systemically important institutions and markets in financial networks, informing regulatory strategies for improving financial stability
  • Network modelling of time-series data for financial risk management, asset allocation and portfolio management
  • Social network analysis (SNA) in finance, e.g. for making credit and investment decisions
  • Applied network visualisation techniques that improve the communication of financial risks and rewards
  • Analysis of counterparties and their risk exposure from interconnectivity with the financial system.

Download the full programme

Keynote session one

Daniel Ralph

Academic Director, Centre for Risk Studies

Professor Daniel Ralph is a Founder and Director of the Centre for Risk Studies, Professor of Operations Research at Cambridge Judge Business School, and a Fellow of Churchill College. Daniel received his PhD in 1990 from the University of Wisconsin Madison. He was a faculty member of the Mathematics & Statistics Department at the University of Melbourne before coming to the University of Cambridge for a joint appointment in the Engineering Department and Cambridge Judge Business School. Daniel’s research interests include optimisation methods, equilibrium models for electricity markets, and risk in business decision making. He is Editor-in-Chief of Mathematical Programming (Series B).

Welcome & Agenda

Download slides

Dr Kimmo Soramäki

Founder & CEO, Financial Network Analytics

Kimmo Soramäki is the Founder and CEO of Financial Network Analytics and Editor-in-Chief of the Journal of Network Theory in Finance. Before founding FNA in 2010, he worked for 15 years in policy-making, advisory and multidisciplinary research roles at several central banks – including the European Central Bank and the Federal Reserve Bank of New York. His research has focused on the interconnectedness of financial systems and on systemic risk. He has published in academic journals in the areas of economics, statistical mechanics and operations research and is a frequent speaker at industry and academic conferences. Kimmo holds a Doctor of Science in Operations Research and a Master of Science in Finance.

Applications of Network Theory in Finance

Download slides

Rosario Mantegna

Professor, Central European University and Palermo University

Professor Mantegna received his PhD in physics from Palermo University in 1990. He is a professor at Central European University, in a joint appointment between the Department of Economics and the Center for Network Science, and at Palermo University, where he founded the Observatory of Complex Systems. His research concerns interdisciplinary applications of statistical physics, with major emphasis on topics in finance and the social sciences. He started working on the analysis and modelling of social and economic systems with the tools and concepts of statistical physics as early as 1990 and is one of the pioneers of econophysics. He published the first paper and coauthored the first book on econophysics.

Similarity-based & Statistically Validated Networks in Finance

Download slides

Rod Garratt

Vice President, Federal Reserve Bank of New York

Rod Garratt is a theorist who specialises in applied game theory. He is known for his work on strategic incentives in payment systems. He has also published theoretical and experimental work related to auctions and helped develop a growing literature on the role of speculators in auctions with resale. He pioneered the use of information-theoretic clustering techniques to describe behavior and identify risks in complex financial networks. Garratt received his PhD from Cornell University before spending over 20 years as a faculty member in the Department of Economics at the University of California, Santa Barbara. He has published in the top economics journals including Econometrica, the American Economic Review and the Journal of Political Economy. He has held visiting positions at UCLA, the University of Arizona and the Bank of England.

Mapping Change in Complex Financial Markets

Download slides

Keynote session two

Iman van Lelyveld

Deputy Head of the International Data Hub, Bank for International Settlements

Iman van Lelyveld is the Deputy Head of the International Data Hub at the Bank for International Settlements, where he contributes to the collection and analysis of risk exposure data on large, systemically important banks (G-SIBs). He studied macroeconomics at the University of Amsterdam. After a year as a foreign exchange trader at Deutsche Bank de Bary, he started at Radboud University on a PhD project, ‘Inflation, Institutions, and Preferences’, which was completed in 1999. At De Nederlandsche Bank (Directorate of Supervision) he has covered subjects such as interest rate risk in the banking book, economic capital (primarily for financial conglomerates) and the Basel Accord (in particular the Supervisory Review). He has published widely, is a member of the BCBS Research Task Force and has chaired international research groups on stress testing and liquidity. He has held a part-time Associate Professorship at Radboud University and has been an advisor to the Bank of England and Norges Bank. His research interests include empirical network models.

Networks, Data & Policy

Download slides

Alan Laubsch

Director and Head of Risk Products, Financial Network Analytics

Alan has 20 years of risk management experience and has advised major global banks, asset managers, and sovereign institutions on market and credit risk. A co-founder of the RiskMetrics Group, his expertise encompasses next-generation risk management practices including early warning methodologies and stress testing. A former VP at JPMorgan’s Risk Advisory Group in New York, he joined that firm in 1993 after receiving a BS in Industrial Engineering from Stanford University. Active in the global risk community, Alan is a frequent speaker at industry and banking forums and recently launched an online “Advanced Stress Testing” course with PRMIA. Alan also contributes to high-profile financial publications such as the Asia Wall Street Journal.

Abstract

This essay examines lessons from systemic breakdowns, and presents a framework for Adaptive Stress Testing to proactively manage systemic risks. The framework is inspired by evolutionary ecosystems, including ecology, economics, technology, psychology, and sociology. Adaptive Stress Testing harnesses network intelligence to integrate early warning signals. We pre-diagnose systemic fragilities by tapping into the marketplace of ideas, and then identify key metrics to monitor market-based early warning signals. We apply the Technology Adoption Lifecycle model to develop a theory of social diffusion of disruptive information in financial markets. We start by taking a macro view of risk in its hidden potential form, and then focus on phase transition signals as risk becomes visible. This process allows us to better understand key systemic risks, and to more effectively sense and respond to emerging risks.

Download slides

Dr Andrew Coburn

Director of External Advisory Board, Centre for Risk Studies

Andrew Coburn is the principal investigator on the research track of understanding financial catastrophe at the Cambridge Centre for Risk Studies, and the convenor of the workshop. Andrew is one of the leading contributors to the creation of the class of catastrophe models that over the past 20 years has come to be an accepted part both of business management in financial services and of public policy making for societal risk. He has extensive experience in developing models and using them for business decision support. Dr Andrew Coburn is a member of the senior management of Risk Management Solutions, the leading provider of catastrophe risk models to the insurance industry.

Financial Catastrophe Risk Modelling

Download slides

Session three: Systemic risk

Goetz von Peter

Senior Economist, Bank for International Settlements

Goetz holds a PhD in economics from Columbia University, and an MSc and BSc from the London School of Economics. Since joining the BIS in 2004, Goetz von Peter has worked as an economist in the various research units of the Monetary and Economic Department. He has written on financial crises, international banking, interbank markets and network analysis, as well as natural catastrophes and global (re)insurance. His research appeared in the Journal of Financial Intermediation, Journal of Banking and Finance, International Finance and the Journal of Economic Literature, among others. Goetz is also a regular contributor to the BIS Quarterly Review, with a focus on current financial market developments and the BIS international banking statistics. From 2009 to 2011 he served the Committee on the Global Financial System (CGFS) and coordinated various working groups in that capacity.

Abstract

The network pattern of financial linkages is important in many areas of banking and finance. Yet bilateral linkages are often unobserved, and maximum entropy serves as the leading method for estimating counterparty exposures. This paper proposes an efficient alternative that combines information-theoretic arguments with economic incentives to produce more realistic interbank networks that preserve important characteristics of the original interbank market. The method loads the most probable links with the largest exposures consistent with the total lending and borrowing of each bank, yielding networks with minimum density. When used in a stress-testing context, the minimum-density solution overestimates contagion, whereas maximum entropy underestimates it. Using the two benchmarks side by side defines a useful range that bounds the cost of contagion in the true interbank network when counterparty exposures are unknown.
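To illustrate the maximum-entropy benchmark discussed above, the sketch below spreads bilateral exposures as evenly as possible across banks, subject only to each bank's total interbank lending and borrowing, using iterative proportional fitting (RAS). It is a minimal illustration with hypothetical aggregates, not the authors' code; the minimum-density method proposed in the paper instead concentrates the same totals on as few links as possible.

```python
import numpy as np

def max_entropy_matrix(lending, borrowing, n_iter=500, tol=1e-10):
    """Maximum-entropy estimate of bilateral exposures with a zero diagonal,
    matched to given row sums (total lending) and column sums (total borrowing)
    by iterative proportional fitting (RAS). Illustrative sketch only."""
    lending = np.asarray(lending, dtype=float)
    borrowing = np.asarray(borrowing, dtype=float)
    n = len(lending)
    X = np.ones((n, n))
    np.fill_diagonal(X, 0.0)  # banks do not lend to themselves
    for _ in range(n_iter):
        row = X.sum(axis=1)   # scale rows to match each bank's total lending
        X *= (lending / np.where(row > 0, row, 1.0))[:, None]
        col = X.sum(axis=0)   # scale columns to match each bank's total borrowing
        X *= (borrowing / np.where(col > 0, col, 1.0))[None, :]
        if (np.abs(X.sum(axis=1) - lending).max() < tol and
                np.abs(X.sum(axis=0) - borrowing).max() < tol):
            break
    return X

# Hypothetical aggregates for four banks (overall lending equals overall borrowing)
lending = [30.0, 20.0, 10.0, 40.0]
borrowing = [25.0, 25.0, 30.0, 20.0]
print(np.round(max_entropy_matrix(lending, borrowing), 2))
```

In the paper's terminology, this evenly spread matrix is the benchmark that tends to understate contagion, while a minimum-density allocation of the same margins over few, large links tends to overstate it.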

Paper co-authored by:

  • Kartik Anand, Financial Stability Department, Bank of Canada
  • Ben Craig, Deutsche Bundesbank and Federal Reserve Bank of Cleveland
  • Goetz von Peter, Bank for International Settlements

Download slides

Ivan Alves

Principal Financial Stability Expert, European Central Bank (ECB)

With broad academic and professional experience in international finance and central banking, Ivan Alves is Principal Financial Stability Expert at the Macro Prudential Policy and Financial Stability Directorate General of the European Central Bank (ECB), having held prior appointments in Research, Market Operations Analysis and International Relations. Prior to joining the ECB in September 2000, Mr Alves held positions at the World Trade Organisation (Geneva) and the European Statistical Office (Luxembourg) and was a lecturer at the University of British Columbia (Canada). Mr Alves holds PhD and Master of Arts degrees in Economics from the University of British Columbia and Bachelor of Science degrees in Economics and in Political Science from the Massachusetts Institute of Technology (USA).

Abstract

European banks’ cross-exposures via marketable securities are largely visible through the collateral supporting ECB operations. The network thus inferred is referred to as the interbank securities network. Owing to its geographical comprehensiveness (covering all banks active in the euro area), the detail on the type of cross-bank exposure and the high frequency of the information, it allows analysis of structural developments in euro area banking and, in particular, provides input for policy decisions. For this purpose a number of system-wide metrics are developed and their usefulness for policy making is explained. Notably, information available since 2009 suggests that integration in the interbank system has been faltering and that the interbank structure is increasingly organised in national components.

Download slides

Sriram Rajan

Senior Researcher, Office of Financial Research / Treasury

Sriram Rajan is a senior researcher at the Office of Financial Research. Mr Rajan focuses on credit default swap markets, with research interests in central clearing, counterparty risk and risk visualisation. Prior to joining the OFR he served as a consultant in the European Union emissions trading markets and worked on structured credit derivative transactions at Bear Stearns. Mr Rajan holds an MS in Computational Finance from Carnegie Mellon University and a BS from Duke University.

Abstract

Intro/Problem

Partially as a result of recent regulatory policy, financial regulators1 have been tasked with increased levels of market surveillance and oversight, aimed at ensuring healthy market liquidity and identifying potentially destabilising risks. Included within this mandate is the promotion of financial stability by mitigating risks that could harm the market as a whole. One of the more prominent areas of concern during the 2008-2009 financial crisis was swaps linked to credit risk, including credit default swap (CDS) contracts. These swaps can be especially difficult to monitor because they embed multiple risk dimensions, including risk to the trade counterparty and to the reference entity. Because of a general lack of both regulatory and public data prior to the crisis, international policymakers have called for increased data transparency and, subsequently, effective methods to use these data for robust financial oversight.

Literature/ Background Risk

Regulatory literature over the last few years has expanded its use of network analysis, and its associated metrics, to assess not just how a bank engages in risk transfer with its counterparties but how it fits within the risk transfer network as a whole. Examples of this use of network tools include the Fed’s monitoring of credit risks passed through the bank dealer network, an IMF working paper on systemic risk as implied by FDIC swap exposures, and the Bank of England’s work on cross-border balance sheet exposures. However, many commonly used network metrics, such as measures of centrality and aggregated accounts of risk flows, are not always well calibrated to the specifics of a given financial market (e.g. jump-to-default risks in credit swaps are not shared by products like interest rate or FX swaps). Results or risk rankings generated by network metrics often portray only one of a number of concerns that may arise from bank activity. In their most general forms, these measures typically focus on a single risk dimension, such as risk to market movements or risk to market volatility, rather than on the relationships between risk dimensions. As seen recently, these relationships can often have the most prominent effects on system stability2.

Visual Analysis

One approach to addressing some of the oversight gaps resulting from a reliance on metrics alone has been to integrate them with visual analytics. This technique merges analytical reasoning with interactive visual interfaces. A key advantage of using visual techniques is the transformation of multidimensional data into a visual representation that clarifies important relationships inherent to the network structure and to the idiosyncratic characteristics of nodes and edges. Perer and Shneiderman (2008) suggest that, when done well, the integration of these tools can dramatically speed up insight for visualisation users, in our case regulators and policy-makers. For the case study included in this paper, risk transfers effected by individual financial entities, along with entity classes (at a determined aggregate level), will be matched with network measures that proxy for an entity’s importance to the network at a global level.

CDS Market

CDS instruments provide insurance against the default of a given institution (commonly known as the reference entity) and require regular payment streams from the counterparty purchasing protection. After an event of company default, where the company cannot fully satisfy its debt obligations, the CDS seller agrees to make the purchaser “whole,” paying against losses accrued on a given set of bonds or the equivalent bond notional. Historically, most of these transactions have been bilateral, with both counterparties having present and/or future, state-contingent, payment obligations. This system of relationships is one of the largest examples of credit intermediation within the financial world3, and takes three primary forms: the assumption of credit risk by dealers from their clients, the distribution of risks between dealers, and the reallocation by dealers of risks back to their clients. The visualisations developed in the paper focus on interdealer trading, as reported to data repositories4, including how client risks are intermediated between dealers.
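As a rough illustration of the cash flows described above, the sketch below computes the running premium paid by a protection buyer and the payout owed by the seller after a credit event. The notional, spread and recovery rate are hypothetical, and real settlement mechanics (accruals, auctions, collateral) are omitted.

```python
def cds_running_premium(notional, spread_bps, year_fraction):
    """Premium paid by the protection buyer for one accrual period."""
    return notional * spread_bps / 10_000 * year_fraction

def cds_default_payout(notional, recovery_rate):
    """On a credit event the seller makes the buyer 'whole' by paying the loss
    on the reference obligation, i.e. notional * (1 - recovery). Simplified:
    real settlement uses an auction-determined recovery rate."""
    return notional * (1.0 - recovery_rate)

# Hypothetical contract: 10m notional, 250bp running spread, quarterly payments
print(cds_running_premium(10_000_000, spread_bps=250, year_fraction=0.25))  # 62500.0
print(cds_default_payout(10_000_000, recovery_rate=0.40))                   # 6000000.0
```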

Hive Plot

Given that CDS contracts have historically relied on bilateral relationships, network visualisations, when used, have typically focused on how trade relationships and exposures distribute risk across one’s counterparties (see Yellen, 2013 and Haldane, 2009). Corresponding network measures often then capture, quantitatively, the risks depicted in the associated network diagrams. These measures encapsulate individual forms of risk transfer, such as the most prominent vectors of risk between entities (betweenness centrality), the entities with the greatest distribution of risk (closeness centrality) or an entity’s importance to risk at a global level (eigenvector centrality). Credit risk summaries at the level of a node, on the other hand, depend critically on a counterparty’s concentration and default correlation with a specific reference entity, risk that is sourced from the product itself. Market participants rely on contractual valuation models that take into account the joint risk of counterparty and reference entity default, often with difficulty5; the problem is even more acute for regulators, who must be able to grasp these joint exposures not just for a single financial firm but for firms which differ in size, geographical reach and business practices. Establishing a concise method of depicting these variations across firms, while also prioritising risk as it may affect financial stability, is a crucial task of regulatory oversight.
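The centrality measures named above can be computed directly on an exposure network. The sketch below does so with networkx on a small hypothetical CDS network; the entity names, notionals and the treatment of edge direction are illustrative assumptions, not the paper's data or methodology.

```python
import networkx as nx

# Hypothetical CDS exposure network (names and notionals are illustrative):
# an edge u -> v with weight w means u has bought w of protection from v.
edges = [
    ("HedgeFund", "DealerA", 50), ("DealerA", "DealerB", 45),
    ("DealerB", "AssetMgr", 40), ("DealerA", "DealerC", 20),
    ("DealerC", "DealerB", 15), ("Insurer", "DealerC", 30),
]
G = nx.DiGraph()
G.add_weighted_edges_from(edges)

# The three centralities named in the text, in one plausible implementation.
betweenness = nx.betweenness_centrality(G)    # prominent conduits of risk
closeness = nx.closeness_centrality(G)        # breadth of risk distribution
# Eigenvector centrality is computed on the undirected projection to keep the
# sketch robust; a real analysis would treat direction and weights explicitly.
eigenvector = nx.eigenvector_centrality(G.to_undirected(), max_iter=1000)

for node in G.nodes:
    print(f"{node:>10}  betweenness={betweenness[node]:.2f}  "
          f"closeness={closeness[node]:.2f}  eigenvector={eigenvector[node]:.2f}")
```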

In this paper, we propose a less commonly used graphical representation, the hive plot (Krzywinski, 2012), matched with numerical summaries, as one way of achieving this efficient rendering. Figure # provides a sample hive plot, where axes consolidate entities by business practice (dealers versus clients) and by counterparty risk versus product risk (the latter on the reference entity axis). The individual nodes and edges in the chart display the relationships between counterparties, and can be filtered by a selection of reference entities. To clarify with an example, a buy-side institution (for example, a hedge fund) may purchase CDS protection from a dealer, who then offsets this position through a trade with a second dealer. Both dealers are shown twice, to allow for explicit depiction of the interdealer trade. Finally, the second dealer “closes the loop” by buying protection from a second end user outside the interdealer core of the market (perhaps an asset manager). Detailed depictions like this provide valuable context for an analyst to understand market behaviour and to evaluate counterparty and reference entity risks in unison.

Applications

In our paper we outline how network risks of multiple types (risks within the interdealer network, risks between dealers and their clients, and risks due to reference entity characteristics) evolve during crisis events and during natural periods of risk re-allocation (such as the credit index roll), and how margin and clearing may mitigate some of these concerns.

Conclusions

Our paper suggests that visual tools can play a significant role when integrated with traditional network analytical techniques; this integration is especially powerful when a network visualisation is attuned to the idiosyncratic characteristics of a market structure – in our case the hive plot and bilateral CDS relationships. This visual technique allows for the integration of several risk dimensions, such as distinguishing between the type and importance of a counterparty along with the importance of product risk, and therefore clarifies financial stability weaknesses. To demonstrate the value and potential uses of these visual techniques, we show how their application to the bilateral CDS market, especially during periods of market instability, can focus regulators on the sources and vectors of potentially cascading failures, along with ways of mitigating these concerns.


1 By “regulator” we implicitly encompass a number of different parties, the most obvious being public government market regulators such as the SEC, CFTC or FSA. In addition, however, this could include SROs such as FINRA or the NFA within the US, and exchanges tasked with market oversight.

2 For instance, the risk associated with AIG’s CDS position was exacerbated by the link between adverse market movements on its CDS sales and its own creditworthiness, which prompted margin calls from its counterparties that it could not meet.

3 The most recent estimate of global credit derivatives notional exposure (H1 2013) by the Bank for International Settlements is approximately $24tr.

4 The data is sourced from the largest credit swap data repository during the period of study, the Depository Trust and Clearing Corporation (DTCC).

5 See the controversies regarding the use of Gaussian copulas to estimate correlation risks in credit markets.

Paper co-authored by:

  • Sriram Rajan, Office of Financial Research
  • Richard Haynes, US Treasury Department
  • Mark Paddrik, Office of Financial Research

Download slides

Falk Bräuning

PhD Candidate, VU University Amsterdam/Tinbergen Institute

Falk Bräuning is a PhD student in the department of econometrics at VU University Amsterdam and Tinbergen Institute. His research interests are in the empirical analysis of money markets and the econometric analysis of financial and economic networks. More broadly he is interested in macro-finance, monetary policy and econometric methodology. Prior to joining the VU University Amsterdam, Falk obtained a Diplom in Economics from Mannheim University and an MPhil in Economics with a specialisation in econometrics from Tinbergen Institute.

Abstract

This paper introduces a structural micro-founded dynamic stochastic network model for the unsecured interbank lending market. Banks are profit optimising agents subject to random liquidity shocks and can engage in costly counterparty search to find suitable trading partners and peer monitoring to reduce counterparty risk uncertainty. The structural parameters are estimated by indirect inference using appropriate network statistics of the Dutch interbank market. The estimated model is shown to explain accurately the high sparsity and stability of the lending network. In particular, monitoring of counterparty risk and directed search are shown to be key factors in the formation of interbank trading relationships that are associated with improved credit conditions. Moreover, the estimated degree distribution is highly skewed with few highly interconnected core banks and peripheral banks that trade mainly with core banks. Shocks to credit risk uncertainty can lead to extended periods of decreased market activity, but reactions are heterogeneous across banks. Finally, we analyse optimal network responses to changes in the central bank’s discount window.

Paper co-authored by:

  • Francisco Blasques, Department of Econometrics and Finance, VU University Amsterdam, Tinbergen Institute
  • Falk Bräuning, Department of Econometrics, VU University Amsterdam, Tinbergen Institute
  • Iman van Lelyveld, De Nederlandsche Bank

Download slides

Iñaki Aldasoro

PhD Candidate, Goethe University Frankfurt

Iñaki Aldasoro is a PhD candidate at Goethe University Frankfurt and a researcher in the macro-finance area at the Research Center SAFE, with a focus on financial stability, interbank networks and systemic importance. He has also visited the International Monetary Fund and worked as a consultant at the European Central Bank. Before coming to Frankfurt he worked at the Brussels-based think tank Bruegel, studied at the Kiel Institute for the World Economy and worked in economic consulting in Buenos Aires, where he also studied economics.

Abstract

We present a network model of the interbank market in which optimising risk averse banks lend to each other and invest in liquid and non-liquid assets. Banks can trade on both sides of the interbank market. Clearing in the interbank and in the non-liquid asset markets takes place through a price tâtonnement mechanism. In order to match traded quantities we consider three alternative algorithms: Maximum Entropy, Closest matching and Random matching. We analyse the resulting network configurations using various network centrality, input-output and systemic risk (Shapley value) metrics. The interbank network generated by the model replicates several features of empirical interbank networks. We exploit our model to assess the performance of two prudential policies (liquidity and equity requirements) on the stability and the efficiency of the system (the latter measured by overall investment). Overall we find that liquidity requirements unequivocally increase stability but reduce efficiency, while equity requirements tend to increase stability without reducing significantly overall investment.

Paper co-authored by:

  • Iñaki Aldasoro, Goethe University Frankfurt & SAFE
  • Domenico Delli Gatti, Catholic University of Milan
  • Ester Faia, Goethe University Frankfurt, CFS & SAFE

Download slides

Andreas Krause

Assistant Professor in Economics, University of Bath

Andreas Krause is Assistant Professor in Economics at the University of Bath. His research interests lie particularly in the systemic risk of banking systems, where he focuses mainly on the network structure of interbank exposures and how bank failures spread through such networks. More generally he is interested in heterogeneous agents interacting through networks and how such interactions shape aggregate outcomes, such as stock prices or analyst recommendations. Previous work has included an analysis of credit card markets using a network approach and questions of optimal market design. His work has been published in the Journal of Economic Behavior and Organization, IEEE Transactions on Systems, Man, and Cybernetics – Part A: Systems and Humans, Physica A, the Quarterly Review of Economics and Finance, Intelligent Systems in Accounting, Finance and Management, and many more.

Abstract

We analyse the impact that minimum capital and reserve requirements have on bank failures arising from solvency and liquidity shortages in a banking system where banks are characterised by their capital, cash reserves and their exposure to the interbank loan market as both borrowers and lenders. A network of interbank lending is established and used as a transmission mechanism for the failure of banks through the system. We find that the impact of minimum capital and reserve requirements is small and that excess holdings work to a similar degree, suggesting that targeting capital requirements more carefully at specific banks can be more effective than common minimum requirements for all banks.
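A minimal sketch of the solvency contagion mechanism underlying this kind of analysis is given below: when a bank fails, its interbank creditors write down their claims, and any bank whose cumulative losses exceed its capital fails in turn. The exposure matrix, capital levels and recovery rate are hypothetical, and the liquidity and reserve channels studied in the paper are not modelled.

```python
import numpy as np

def solvency_cascade(exposures, capital, initial_failures, recovery=0.0):
    """Propagate defaults through an interbank exposure matrix, where
    exposures[i][j] is the amount bank i has lent to bank j. A bank fails once
    its accumulated write-downs exhaust its capital. Simplified sketch: one
    recovery rate, no liquidity or reserve channel."""
    exposures = np.asarray(exposures, dtype=float)
    capital = np.asarray(capital, dtype=float).copy()
    n = len(capital)
    failed = set(initial_failures)
    newly_failed = set(initial_failures)
    while newly_failed:
        # creditors of newly failed banks lose (1 - recovery) of their claims
        losses = (1.0 - recovery) * exposures[:, sorted(newly_failed)].sum(axis=1)
        capital -= losses
        newly_failed = {i for i in range(n) if i not in failed and capital[i] <= 0}
        failed |= newly_failed
    return failed

# Hypothetical four-bank system in which bank 3 fails first
exposures = [[0, 10, 5, 8],
             [4,  0, 6, 12],
             [2,  3, 0, 9],
             [6,  1, 2, 0]]
capital = [15, 10, 8, 5]
print("Failed banks:", sorted(solvency_cascade(exposures, capital, initial_failures=[3])))
```

Raising the capital figures (or the recovery rate) shortens or stops the cascade, which is the kind of comparative exercise the paper performs for capital and reserve requirements.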

Paper co-authored by:

  • Andreas Krause, School of Management, University of Bath
  • Simone Giansante, School of Management, University of Bath

Download slides

Session five: SNA in finance

Dan Evans

Senior Researcher, Network Science Center, West Point

Dan Evans is a Senior Researcher at the Network Science Center at West Point, an innovative research centre that develops advances in the study of network representations of physical, biological and social phenomena. Dan has an extensive background in modelling economic networks, with a focus on emerging markets, especially on the African continent. He is a graduate of the US Military Academy, West Point, and received his MBA from the College of William and Mary. He served as an Infantry Officer, Operations Research Analyst and Assistant Professor of Economics in the US Army for over 20 years and is a combat veteran.

Valdis Krebs

Founder & Chief Scientist, Orgnet

Valdis Krebs is the Founder and Chief Scientist at Orgnet LLC and a visiting researcher at the University of Latvia. Valdis is a management consultant, researcher, trainer, author, and the developer of InFlow software for social and organisational network analysis (SNA/ONA). He has been mapping, measuring and morphing social, knowledge, business and covert networks for over 25 years. His clients and partners include large and small businesses, branches of government, non-governmental organisations, consulting organisations, and universities in North America and Europe. Valdis is a pioneer in mapping networks of interest using available public information; he was the first to produce and publish a network map of the 9/11 hijackers from major media sources after the 2001 attacks.

The Mortgage Meltdown: From Main Street to Wall Street

Download slides

Daniel Ladley

Senior Lecturer in Finance, University of Leicester

Dr Daniel Ladley is Senior Lecturer in Finance at the Department of Economics, University of Leicester and Deputy Director of the Leicester Institute of Finance. He holds a PhD in finance from Leeds University Business School. His research focuses on the application of computational and numerical techniques to problems in finance. He has published papers on microstructure, examining the effects of regulations and market structure, and the analysis of systemic risk in financial systems.

Recent years have seen rapid advances in the techniques used to measure the susceptibility of financial systems to systemic risk. Of particular note is the increased use of network-based measures to empirically examine the effects of the failures of banks on the rest of the financial system. In this paper we employ these techniques to examine the spread of contagion in the American financial system in the period around the 1873 banking panic. There are two purposes to this work. First, it enhances our understanding of the mechanisms at play during the 1873 crisis. Previous studies of this period have used written accounts and aggregated data to understand the behaviour of banks and the financial system. No analysis has employed the latest techniques in network analysis with the available data to quantitatively examine how the structure of the financial system, and the positions of the banks, might allow crises to spread. Our study aims to resolve debates in the economic history literature with regard to how this crisis propagated. Historical studies differ on this point: some argue that the crisis principally stemmed from New York and that its effects were only significantly felt there; others argue that the state of the periphery played a much more significant role. There are also arguments regarding the actual susceptibility of banks to failure – whether the crisis spread through loss of confidence or actual weakness in balance sheets.

The second purpose of this work is to increase our understanding of modern financial crises. One of the key difficulties in understanding how balance sheet information translates into financial weakness in modern banks is the complex asset positions and off-balance-sheet activities which modern banks frequently adopt. An advantage of studying the period of history this paper focuses on, as opposed to the most recent financial crisis, is that the asset holdings of banks were much simpler. There was at the time no national overnight interbank lending system – the physical constraints of communication and transportation across long distances made this impossible. Similarly, the number of derivative contracts, and in particular derivatives on financial firms or positions, was much smaller. In particular, CDS were not traded for another 100 years, and off-balance-sheet activities such as CDOs and SPVs were unheard of. As a result this historical financial system provides a much cleaner test bed for the analysis of contagion and systemic risk.

We base our analysis on data from the Annual Report of the Comptroller of the Currency. These reports are available from the Federal Reserve Archive and detail the full balance sheets of all U.S. national banks from 1863 onwards. In this paper we focus on the years surrounding the 1873 panic: 1872-1874. During this period there were approximately two thousand banks spread across forty-two states and territories. One hundred and eighty-two of these banks were located in fifteen reserve cities and a further forty-nine were based in the central reserve city of New York. Based on this we construct networks of financial connections. In similar work focusing on current financial systems, maximum entropy techniques have been used to produce configurations of relationships based on minimal assumptions. This has been shown, however, to create an overly connected network which, in turn, is less susceptible to systemic risk than the real financial systems being modelled. To remedy this we incorporate the nature of the regulatory system and technological constraints to refine the network. During this period state banks were required to keep a portion of their deposit holdings in reserve city banks, such as those in St. Louis or Boston, whilst reserve city banks had to keep a portion of their reserves in the banks of the central reserve city – New York. The balance sheet data specifically lists the amount each bank is owed from reserve city banks and the amount owed from other national banks. This allows us to place constraints on viable network structures.

We also exploit the constraint that communication and the transfer of funds were not instantaneous at this time. In the modern financial system electronic communication technology means that if a bank wishes to lend money, the physical location of the counterparty within a given jurisdiction has little impact: for a bank in New York, lending to a bank in Boston or in San Francisco makes little material difference. In the 1800s, however, the additional time it would take to transport and then recover funds from San Francisco rather than Boston created significant liquidity risks. The lack of communications technology also meant that far less information was available about geographically distant banks. As such, we model banks as having a preference for geographically close counterparties. We use a range of specifications for the effect of distance in modelling this factor. In constructing the most likely network we numerically minimise the distance between counterparties whilst maintaining compliance with regulations and balance sheet quantities. Due to the very large space of possible networks we employ a stochastic approach to this minimisation. As a result many viable networks are generated and then tested to provide robust conclusions.
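The sketch below illustrates, in a much simplified form, this kind of distance-penalised network reconstruction: starting from a proportional allocation that matches each bank's total lending and borrowing, it applies random 'rectangle' swaps that preserve those totals and keeps only swaps that reduce the total distance-weighted exposure. The bank positions and totals are hypothetical, and the reserve-city regulatory constraints (and the exclusion of self-exposures) are omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

def distance_preferred_network(lending, borrowing, dist, n_steps=20000):
    """Stochastic search for a bilateral lending matrix that matches each bank's
    total lending (row sums) and borrowing (column sums) while favouring nearby
    counterparties. Starts from a proportional allocation and applies
    cost-reducing 'rectangle' swaps, which leave all row and column totals
    unchanged. Simplified: no regulatory constraints, self-exposures allowed."""
    lending = np.asarray(lending, dtype=float)
    borrowing = np.asarray(borrowing, dtype=float)
    dist = np.asarray(dist, dtype=float)
    n = len(lending)
    X = np.outer(lending, borrowing) / borrowing.sum()  # feasible starting point
    for _ in range(n_steps):
        i, k = rng.integers(n, size=2)
        j, l = rng.integers(n, size=2)
        delta = min(X[i, j], X[k, l])
        if delta <= 0 or i == k or j == l:
            continue
        # moving delta from (i,j),(k,l) to (i,l),(k,j) preserves the margins;
        # accept only if the total distance-weighted exposure falls
        change = delta * (dist[i, l] + dist[k, j] - dist[i, j] - dist[k, l])
        if change < 0:
            X[i, j] -= delta; X[k, l] -= delta
            X[i, l] += delta; X[k, j] += delta
    return X

# Hypothetical example: four banks on a line, distance = difference in position
pos = np.array([0.0, 1.0, 2.0, 10.0])
dist = np.abs(pos[:, None] - pos[None, :])
lending = [30.0, 20.0, 10.0, 40.0]
borrowing = [25.0, 25.0, 30.0, 20.0]
X = distance_preferred_network(lending, borrowing, dist)
print(np.round(X, 1))
print(X.sum(axis=1), X.sum(axis=0))  # margins are preserved
```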

 

We use historical data to construct a series of stress tests to analyse the vulnerability of the generated financial networks to shocks. The scenarios are based on movements in the stock markets, defaults on railroad bonds, withdrawals of specie and the failure of counterparty banks. The analysis is repeated across each of the three years of data. Analysis of 1872 allows us to consider the run-up to the crisis and how close the banking system came to failure a year earlier than it actually occurred. By analysing 1874 we may determine whether the actions taken after the crisis were adequate to reduce the risk of banks suffering adverse events. We focus our analysis on two aspects – the susceptibility of New York banks to shocks and the likelihood of contagion across the rest of the system. New York was the financial centre at the time, so the failure of banks in this city could have a significant effect on the stock market and international trade. Banks in the rest of the country were relatively smaller; however, their failure could cause serious issues for the communities they helped to support.

Paper co-authored by:

  • Daniel Ladley, Department of Economics, University of Leicester
  • Peter L. Rousseau, Department of Economics, Vanderbilt University

Download slides

Murat Ünal

Founder & Chief Executive, SONEAN and Funds@Work

Murat is Founder and Chief Executive of SONEAN GmbH (since 2013) and Funds@Work AG (since 2001). Both strategy consulting firms advise the asset management industry and use social network analysis to systematically uncover risks and opportunities as well as social networks’ impact on organisational outcomes. Murat also serves on the Dean’s Advisory Board of WHU, is a member of the Advisory Board of Itijah (Common Purpose) and is a trustee of its Frankfurt/Germany entity. With a network of well-known researchers from across the globe, Murat has created a group, ‘SNA at Work’, dedicated to the application of social network analysis (SNA), which combines rigorous research with high relevance for organisations. Murat holds a Bachelor Degree in Commerce from the University of Adelaide, and an MBA from both the Kellogg School of Management and WHU – Otto Beisheim School of Management. He also holds an LLM degree from Northwestern Law School as well as a Doctorate in Business Administration from IE Business School.

Going beyond Financial Data: The Social Network Analytic Perspective

Download slides

Tiziana Di Matteo

Professor of Econophysics, King’s College London

Tiziana Di Matteo is Professor of Econophysics. A trained physicist, she took her degree and PhD at the University of Salerno in Italy before assuming research roles at universities in Australia and Britain. She works in the Department of Mathematics at King’s College London on econophysics, complex networks and data science. She has authored over 80 papers and has given invited and keynote talks at major international conferences in the US, across Europe and in Asia, making her one of the world’s leaders in this field.

We are witnessing interesting times, rich in information that is readily available to us all. Using, understanding and filtering such information has become a major activity across science, industry and society at large. Our society has become a global information-processing system where news propagates and impacts the real economy at increasingly fast rates and with increasingly large effects. It is therefore important to have tools that can analyse this information while it is generated and that can reduce complexity and dimensionality while keeping the integrity of the dataset. Information content and flow are often associated with large degrees of redundancy, both in time (repeating and scaling patterns) and across different variables (similarity, dependency and causality). Redundancy is often used to convey strength to the meaning or, more simply, it signals recurring patterns with high statistical significance that are therefore important.

In this talk we propose to use such redundancy to build an information-based network that retains the relevant part of the data-interdependency structure. The structure of this network is a representation of the information in the dataset, and this information can be efficiently analysed using network-theoretic tools. The idea of using redundancy – namely correlation coefficients – to filter information in large-scale datasets by building networks of relevant links has been actively studied in the literature, mostly by means of two approaches: 1) the minimum spanning tree (MST)1,2 and 2) the planar maximally filtered graph (PMFG)3,4. The common idea underlying these two approaches is to retain the largest and most significant sub-graph possible while imposing global constraints on the topology of the resulting network. In particular, in the MST approach, the links with the largest weights (e.g. correlations) are retained while constraining the sub-graph to be globally a (spanning) tree. Similarly, in the PMFG construction the largest weights (e.g. correlations) are retained while constraining the sub-graph to be globally a planar graph. The PMFG has richer information content than the MST, with a larger number of edges (3N-6 instead of N-1, with N being the number of vertices) and the presence of 3- and 4-cliques. Planar filtered graphs are powerful tools to study complex datasets. It has been shown in5 that by making use of the 3-clique structure of the PMFG a clustering can be extracted, allowing a dimensionality reduction that keeps both local information and global hierarchy in a deterministic manner without the use of any prior information.
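As a concrete illustration of the MST step described above, the sketch below converts a correlation matrix into Mantegna's distance metric and extracts the minimum spanning tree with networkx; the returns are random placeholder data. The PMFG construction, which additionally enforces planarity, is more involved and is not shown.

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(42)

# Placeholder data: daily returns of N assets over T days
N, T = 8, 500
returns = rng.normal(size=(T, N))

# Correlation matrix and Mantegna's distance d_ij = sqrt(2 * (1 - rho_ij))
rho = np.corrcoef(returns, rowvar=False)
d = np.sqrt(2.0 * (1.0 - rho))

# Complete weighted graph, then the minimum spanning tree: the N - 1 strongest
# correlations that are compatible with the spanning-tree constraint
G = nx.Graph()
for i in range(N):
    for j in range(i + 1, N):
        G.add_edge(i, j, weight=d[i, j], correlation=rho[i, j])

mst = nx.minimum_spanning_tree(G, weight="weight")
for u, v, attrs in sorted(mst.edges(data=True), key=lambda e: e[2]["weight"]):
    print(f"asset {u} -- asset {v}: rho = {attrs['correlation']:+.2f}")
```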

We show that applications to financial data-sets can meaningfully identify industrial activities and structural market changes6. Planar filtered graphs can be used to diversify financial risk by building a well-diversified portfolio that effectively reduces investment risk by investing in stocks that occupy peripheral, poorly connected regions in the financial filtered networks7. Another appealing advantage of planar filtered networks, such as the PMFG, concerns graphical modeling (e.g. Markov Random Fields) where the structure of the network ensures that exact inference algorithms can be performed in an efficient fashion8.

However, the algorithm so far proposed to construct the PMFG is numerically costly, with O(N^3) computational complexity, and cannot be applied to large-scale data. There is therefore scope to search for novel algorithms that can provide, in a numerically efficient way, such a reduction to planar filtered graphs. We introduce a new algorithm, the TMFG (Triangulated Maximally Filtered Graph), that efficiently extracts a planar subgraph which optimises an objective function. The method is scalable to very large datasets and can take advantage of parallel and GPU computing. The method is adaptable, allowing online updating and learning with continuous insertion and deletion of new data as well as changes in the strength of the similarity measure9.


1 R.C. Prim, Bell-System Technical Journal 36 (1957) 1389-1401.

2 R.N. Mantegna, Eur. Phys. J. B 11 (1999) 193-197.

3 T. Aste, T. Di Matteo, S. T. Hyde, Physica A 346 (2005) 20.

4 M. Tumminello, T. Aste, T. Di Matteo, R. N. Mantegna, PNAS 102, n. 30 (2005) 10421.

5 W.-M. Song, T. Di Matteo, and T. Aste, PLoS ONE 7 (2012) e31929.

6 N. Musmeci, T. Aste, T. Di Matteo, “Clustering and hierarchy of financial markets data: advantages of the DBHT”, arXiv:1406.0496 [q-fin.ST], submitted 2014.

7 F. Pozzi, T. Di Matteo and T. Aste, Scientific Reports 3 (2013) 1665.

8 A. Globerson and T. Jaakkola, “Approximate inference using planar graph decomposition”, in Advances in Neural Information Processing Systems 19: Proceedings of the 2006 Conference, Vol. 19, page 473, MIT Press, 2007.

9 G. Previde Massara, T. Di Matteo, and T. Aste, draft (2014)

Paper co-authored by:

  • T. Aste, Department of Computer Science, University College London and Systemic Risk Centre, London School of Economics and Political Sciences
  • T. Di Matteo and N. Musmeci, Department of Mathematics, King’s College London
  • G. Previde Massara, Department of Computer Science, University College London

Christian Brownlees

Assistant Professor, Universitat Pompeu Fabra

Christian Brownlees is Assistant Professor at Universitat Pompeu Fabra and Barcelona GSE Affiliated Professor. He received his undergraduate degree in Economics and Quantitative Methods and PhD in Statistics from Università di Firenze. He was a Post-Doc Research Fellow at NYU Stern until 2011. Professor Brownlees has been working extensively in the field of systemic risk and econometric network analysis.

Abstract

Following the financial crises in the US and Europe, hundreds of systemic risk indicators have begun to be scrutinised by regulators to take the pulse of the economy. When studying such an extensive collection of indices, it is natural to ask to what extent the indicators comove. In this work we analyse a panel of 156 systemic risk indicators from the ESRB risk dashboard from January 1999 to December 2013. We study cross-sectional dependence using a factor-network approach: dependence among indicators is determined by a set of common factors and a network of pairwise partial dependence relations between individual pairs. The empirical analysis shows that network interdependence is prominent, while factors explain a relatively small proportion of overall covariation. Macro, credit and funding risk categories are the most central and highly interconnected indicators in the network. Network linkages are stronger among indicators in the same risk category than among indicators belonging to the same geographic area. Finally, the network centrality analysis shows that macro-imbalance indicators such as the current account balance, residential property prices, corporate sector indebtedness and bank loans-to-deposits ratios are the most interdependent systemic risk bellwethers of the dashboard.
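One simple way to operationalise the factor-plus-network idea described above is sketched below: common factors are extracted with principal components, the network part is taken from the pairwise correlations of the factor residuals (i.e. dependence conditional on the factors), and only links above a threshold are kept. The panel is simulated placeholder data and the threshold is arbitrary; the paper's estimation and link-selection procedures are more sophisticated.

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(1)

# Placeholder panel: T monthly observations of K standardised risk indicators
T, K = 180, 12
X = rng.normal(size=(T, K))
X = (X - X.mean(axis=0)) / X.std(axis=0)

# 1) Common factors: the first few principal components of the panel
n_factors = 2
U, S, Vt = np.linalg.svd(X, full_matrices=False)
factors = U[:, :n_factors] * S[:n_factors]   # factor scores
loadings = Vt[:n_factors]                    # factor loadings
residuals = X - factors @ loadings           # what the factors do not explain

# 2) Network part: dependence that remains after conditioning on the factors,
#    here proxied by the pairwise correlations of the factor residuals
partial = np.corrcoef(residuals, rowvar=False)
np.fill_diagonal(partial, 0.0)

# 3) Keep only links above an (arbitrary, illustrative) threshold
G = nx.Graph()
G.add_nodes_from(range(K))
threshold = 0.15
for i in range(K):
    for j in range(i + 1, K):
        if abs(partial[i, j]) > threshold:
            G.add_edge(i, j, weight=partial[i, j])

degree = nx.degree_centrality(G)
print(f"{G.number_of_edges()} links survive; most connected indicator:",
      max(degree, key=degree.get))
```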

Paper co-authored by:

  • Christian Brownlees, Department of Economics and Business, Universitat Pompeu Fabra and Barcelona GSE
  • Daniele Frison, European Central Bank and European Systemic Risk Board

Download slides

Ben Craig

Senior Economist, Deutsche Bundesbank Research Department

Ben Craig is a Senior Economist at the Deutsche Bundesbank Research Department, in charge of the networks research group and visiting from the Federal Reserve Bank of Cleveland’s Research Department. He has published on a wide variety of economic topics in journals which include the American Economic Review, the Journal of Political Economy, and the Journal of Monetary Economics. His current interests include empirical applications and tests of network theory. His recent research includes work on payments systems, bank failures, defining the core in a core-periphery model, and estimating network effects from time-series and cross section data, using regularization methods. He and his wife have three children.

Abstract

This paper computes data-driven correlation networks based on bank stock returns of international banks and conducts a comprehensive analysis of their topological properties. We first apply spatial dependence methods to filter the effects of strong common factors and a thresholding procedure to select the significant bilateral correlations. The analysis of topological characteristics of the resulting correlation networks shows many common features that have been documented in the recent literature but were obtained with private information on banks’ exposures. Our analysis validates these market-based adjacency matrices as inputs for spatio-temporal analysis of shocks in the banking system.

Paper co-authored by:

  • Martin Saldias, International Monetary Fund
  • Ben Craig, Deutsche Bundesbank

Download slides

Session seven: Networks in banking

Ilya Pollak

Associate Professor of Electrical & Computer Engineering, Purdue University

Ilya Pollak received his BS and MEng degrees in 1995 and PhD in 1999, all from MIT, all in electrical engineering. In 1999-2000, he was a post-doctoral researcher at the Division of Applied Mathematics, Brown University. Since 2000, he has been with Purdue University where he is currently Associate Professor of Electrical and Computer Engineering. His research interests are in machine learning and statistical models applied to financial time series analysis, networks, and image processing. He has collaborated with financial companies on quantitative models for US equities and trade execution algorithms. Professor Pollak received an NSF CAREER award in 2001. He was Lead Guest Editor of the IEEE Signal Processing Magazine Special Issue on Signal Processing for Financial Applications in September 2011. He was the general chair of the IEEE Symposium on Signal and Information Processing in Finance and Economics in December 2013.

José-Luis Molina Borboa

Central Bank of Mexico

José Luis Molina-Borboa is a financial researcher at Banco de México, where he has been using network and statistical models to study financial stability and measure systemic risk in Mexico since 2013. He completed his undergraduate studies in Applied Mathematics at Instituto Tecnológico Autónomo de México (ITAM), having taken some courses at Université Paris-Dauphine (Paris IX).

In this work we investigate the importance and persistence of bank relationships in two important funding markets: the unsecured market and the repo market. Additionally, we study persistence in two other important networks: the derivatives network and the cross-holding of securities network.

The unsecured interbank market is the ultimate market for liquidity, and in Mexico this market is very important for levelling out liquidity between banks at the end of the business day. The repo market, on the other hand, is the most important source of funding for banks, although the interbank repo market represents only a fraction of total funding, most of which comes from legal and natural persons.

Unlike previous works, we do not construct payments or exposures networks for our analysis; instead, we construct networks on the basis of interactions between banks. In particular, we investigate the persistence and the overlap of trading relationships using daily transactional data in these markets. The usefulness of network similarity measures in detecting persistence over time is also studied in this paper.

In this work we show that trading relationships do overlap for some pairs of banks and that link persistence is higher in the repo market. However, trading relationships are more important in the unsecured interbank market as there is no collateral involved in this type of loan. We also examine overlap and link persistence with another important funding network, the cross-holding of securities network. The findings for this network are quite revealing: link persistence is much higher than in the other funding networks, and the overlap with the other segments of activity is low but persists over time. This means that by analysing different networks we can identify genuinely important relationships between banks, and such relationships prevail even in times of stress, which may not be evident when analysing only one network at a time; this points to the interaction between different layers in the multiplex structure of financial networks.
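A standard way to quantify the link persistence and cross-market overlap discussed above is the Jaccard similarity of edge sets, either between consecutive days within one market or between layers of the multiplex on the same day. The sketch below uses hypothetical edge lists rather than the Mexican supervisory data.

```python
def jaccard(edges_a, edges_b):
    """Jaccard similarity of two edge sets: |A intersect B| / |A union B|."""
    a, b = set(edges_a), set(edges_b)
    return len(a & b) / len(a | b) if (a | b) else 0.0

# Hypothetical directed lending relationships (lender, borrower)
unsecured_day1 = [("A", "B"), ("B", "C"), ("C", "A"), ("D", "B")]
unsecured_day2 = [("A", "B"), ("B", "C"), ("D", "C")]
repo_day1 = [("A", "B"), ("C", "A"), ("D", "B"), ("D", "C")]

# Link persistence within one market layer (day to day)...
print("unsecured persistence:", jaccard(unsecured_day1, unsecured_day2))  # 0.4
# ...and overlap across layers of the multiplex (same day, different market)
print("unsecured vs repo overlap:", jaccard(unsecured_day1, repo_day1))   # 0.6
```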

Reciprocity also plays an important role in identifying the persistence of relationships in a banking system. By looking at different networks, a wider interpretation of reciprocity can be obtained, allowing institutions to reciprocate across different markets according to their own profiles and strategies.

Finally, we also investigate whether the most interconnected banks in one market are the same in the other segments of interaction. The ability to identify relevant (highly interconnected) players in different markets, beyond simple aggregated measures, is very important for financial authorities given the recent evidence that the financial system is highly interconnected. Nevertheless, not much work has been done on assessing the role of the multiplex structure in the interconnectedness of a system.

Paper co-authored by:

  • José-Luis Molina-Borboa
  • Marco van der Leij
  • Serafin Martinez-Jaramillo
  • Fabrizio Lopez-Gallo

Rodrigo Cifuentes

Head of Financial Research, Central Bank of Chile

Rodrigo Cifuentes has been the Head of Financial Research at the Central Bank of Chile since April 2010. This unit is in charge of the design and supervision of the Chilean Survey of Household Finances and the stress tests of the Chilean banking system, and of conducting research on macro-financial issues. Between 2006 and 2010 he was Head of Financial Stability, in charge of the Financial Stability Report of the Central Bank of Chile. As a senior economist he led the team that designed and produced the Financial Stability Report that the Central Bank of Chile has published since 2004. He visited the Bank of England twice as a visiting scholar, in 2002-2003 and 2005-2006. He holds an MA and a PhD in Economics from Harvard University.

This paper uses network analysis to derive an optimal policy to contain the systemic risk produced by contagion via direct exposures among banks. We circumvent the standard critique of the lack of behavioural response in counterfactual analysis by using, in addition to actual data on interbank exposures, a simulation framework to assess the impact of policies under a large range of possible configurations of the banking sector.

We measure the systemic risk produced by a bank by the losses that its failure would impose on others. We model three channels of contagion. To the standard channel via direct credit exposures (seminal papers applying this approach are Sheldon and Maurer, 1998; Furfine, 1999; and Upper and Worms, 2001), we add losses coming from the funding side of banks (Chan-Lau et al, 2009) – in particular, those generated when the failure of a bank implies the loss of a source of funding and the debtor bank has to liquidate assets at some fire-sale cost. This loss is determined not only by the funding exposure to the failing bank but also by the liquidity of the debtor bank’s assets, which is considered in the calculations. Finally, we contribute to the relevance of this methodology for policy analysis by parsimoniously including the possibility of runs against banks affected by the failure of the originally failed bank.
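The funding-side channel described above can be illustrated with a small calculation: when a funding source fails, the borrowing bank covers the shortfall by selling liquid assets first and then illiquid assets at a fire-sale haircut, and the write-down on those forced sales is the contagion loss. The sketch below uses hypothetical numbers and omits the credit-loss and bank-run channels modelled in the paper.

```python
def fire_sale_loss(funding_lost, liquid_assets, illiquid_haircut=0.2):
    """Loss from replacing withdrawn funding: liquid assets are sold at par
    first, and any remaining shortfall is met by selling illiquid assets at a
    fire-sale haircut; the loss is the write-down on those forced sales."""
    shortfall = max(funding_lost - liquid_assets, 0.0)   # cash still needed
    # selling illiquid assets with haircut h raises (1 - h) per unit of book
    # value, so book value sold = shortfall / (1 - h) and the loss is the gap
    book_value_sold = shortfall / (1.0 - illiquid_haircut)
    return book_value_sold - shortfall

# Hypothetical debtor bank: 40 of funding from the failed bank, 25 in liquid assets
print(fire_sale_loss(funding_lost=40.0, liquid_assets=25.0, illiquid_haircut=0.2))  # 3.75
```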

We have data on actual interbank exposures among Chilean banks from December 2012 to date. However, in order to derive more general results, we also simulate a range of fictitious banking systems. On both the actual and the fictitious systems, we test the effectiveness of different policies to contain systemic risk. We consider, as a benchmark, bilateral limits on interbank exposures for a range of values. We assess the effectiveness of policies in terms of the reduction in systemic risk – measured by the losses in equity in the system conditional on the failure of a bank – vis-à-vis the reduction in the size of the interbank market. We rank more highly those policies that achieve a larger reduction in systemic risk at a lower cost in terms of the reduction of the interbank market.

Against that benchmark we test other policies. In particular, we test policies that are “systemic” in the sense that they do not limit exposures bilaterally but rather set limits on the exposure of a bank to the rest of the system (Cifuentes, 2004; Cont et al, 2013). In the case of credit exposures, the limit is set in relation to the equity of the rest of the banking system. In the case of liquidity provision, the limits are set in relation to the assets of the rest of the banking system. In addition, we test the Basel Committee’s proposal to limit the bilateral exposures among systemic banks (Basel Committee on Banking Supervision, 2014, section II, D-16). The flexible nature of our simulation framework allows us to do this.

Our results indicate that systemic policies are more efficient in containing systemic risk. In the case of more than one systemic bank, they dominate policies that focus on the bilateral exposures among them, such as the one proposed by the Basel Committee.

Paper co-authored by:

  • Rodrigo Cifuentes, Central Bank of Chile
  • Rubén Poblete-Cazenave, University College London
  • José Gabriel Carreño, Central Bank of Chile

Download slides

Grzegorz Halaj

Financial Stability Expert, European Central Bank

Grzegorz Halaj is a financial stability expert at the European Central Bank, dealing with stress testing issues (Troika programme country stress tests and the EBA/SSM EU-wide stress test) and with financial contagion modelling and assessment. Before joining the ECB he worked in the ALM office of a private bank and as a researcher in financial mathematics and contagion studies (Warsaw School of Economics, CORE in Louvain-la-Neuve, King’s College London and the Fields Institute in Toronto). Grzegorz holds a PhD in financial mathematics and his research interests have recently focused on agent-based modelling of contagion and of banks’ behaviour in light of the theory of optimal portfolio choice.

Abstract

This paper uses network formation techniques based on the theoretical framework of Halaj and Kok (2014) to construct networks of lending relationships between a large sample of banks and non-financial corporations in the EU. Networks of bank-firm lending relationships provide an alternative approach to studying real-financial linkages, one that takes into account the effect of the heterogeneous characteristics of individual banks and firms on the propagation of shocks between the financial sector and the real economy. One particular strength of the model is that the proposed framework provides an assessment not only of how banks are directly related to each other in the interbank market but also of how they may be indirectly related (due to common exposures) via their corporate lending relationships. The model can be used to conduct counterfactual simulations of the contagion effects arising when individual banks and firms, or groups of them, are hit by shocks. This could allow policymakers to gauge specific vulnerabilities in the financial system evolving around the lending relationships between banks and their (corporate) borrowers.

Paper co-authored by:

  • Grzegorz Halaj, European Central Bank
  • Urszula Kochanska, European Central Bank
  • Christoffer Kok, European Central Bank

Download slides

Session eight: Macro networks

Peter Sarlin

Associate Professor of Economics, Hanken School of Economics

Peter Sarlin is an Associate Professor of Economics at Hanken School of Economics (Helsinki, Finland), and Head of RiskLab Finland. Currently, he is visiting the Center of Excellence SAFE at Goethe University Frankfurt. Peter received his PhD degree in Information Systems with distinction at Åbo Akademi University (Turku, Finland) in June 2013. He has also studied at London School of Economics, Stockholm School of Economics and Stockholm University. Peter is a regular visitor at the European Central Bank and Bank of Finland. His research interests include systemic risk, macroprudential supervision, data and dimension reduction and visual analytics. He has published his research in journals from various fields, including Journal of Banking & Finance, Economics Letters, Ecological Informatics, Neurocomputing, Information Visualization, Pattern Recognition Letters and Knowledge and Information Systems.

Abstract

This paper investigates the use of financial linkages as leading indicators of banking crises. While conventional early-warning models make use of macro-financial indicators to signal an impending crisis, we enrich traditional models by also considering country-level and sector-level financial linkages. Using macro-networks, which represent connections between the main financial and non-financial sectors of the economy for European countries, we are able to quantify the position of the banking sector in terms of interconnectedness. In terms of useful early-warning signals for policymakers, we show that measures constructed from the macro-network are determinants of banking crises.

Paper co-authored by:

  • Tuomas Peltonen, European Central Bank
  • Michela Rancan, European Commission JRC
  • Peter Sarlin, Goethe University Frankfurt and RiskLab Finland

Download slides

Ilja Kavonius

Adjunct Professor, University of Eastern Finland

Ilja Kristian Kavonius is a Senior Economist-Statistician at the European Central Bank. Additionally, he holds a Doctor of Social Sciences degree and is an Adjunct Professor (dosentti) at the University of Eastern Finland in the Faculty of Social Sciences and Business Studies. Before joining the ECB he worked at the United Nations Economic Commission for Europe and Statistics Finland. His work and research focus mainly on networks based on integrated accounting systems, measurement of the economy, economic development, risk and wellbeing, both in economics and in economic history. He has long experience of different aspects of statistics, especially national accounts, and is, and has been, a member of several European and world-level expert groups and task forces in this area.

Download slides

Daniel Moran

Researcher, NTNU (Norwegian University of Science and Technology)

Daniel Moran is an economist working with multi-region input-output (MRIO) accounting. He is based in Trondheim, Norway at the Industrial Ecology programme of NTNU. Daniel holds a PhD from the School of Physics at the University of Sydney, where he helped build the Eora global MRIO system. Prior to that he worked at MSCI in New York. Results based on the Eora MRIO have appeared in Nature and PNAS and have been used by the European Commission, the OECD, the UN, and the World Bank.

Download slides
