Does Bank Proximity Still Matter?

Research shows that physical distance is an important determinant of the extent to which small firms can access bank loans. To decide on a loan application, banks collect and assess information about the borrower, and proximity to the borrower is said to facilitate the collection of this information, especially the so-called “soft information”.

This soft information refers to information that is hard to transmit and cannot be directly verified by anyone other than the agent who produces it (Stein, 2002). It comes in many shapes and forms. For instance, loan officers often incorporate soft information into their lending decisions by assessing the character of a firm’s manager. The officer may personally know the firm’s CEO and deem her an honest and reliable person. This is information of great value for making credit decisions, but at the same time something intangible and hard to verify. Other examples of soft information include opinions, ideas, rumours, economic projections, statements of management’s future plans, and market commentary (Liberti and Petersen, 2019). These forms of information are usually collected in person, which is why physical distance has been shown to affect lending decisions.

As in many other industries, the internet has removed many physical barriers to trade and to the transmission of information – including soft information. Online communication channels (e.g. video conferencing) have become almost as “good” as face-to-face ones. It would be reasonable, then, to expect physical distance to be less important for lending decisions than reported in the past.

Recent data released by the U.S. Treasury can serve to shed some light on this matter. The U.S. government, through the Small Business Administration (SBA), instituted a special loan program designed to provide financial assistance to small businesses impacted by the COVID-19 crisis. This program, named the Paycheck Protection Program (PPP), was intended to provide a direct incentive for small businesses to keep their workers on payroll. To access these funds, small businesses would submit a loan application through any federally insured depository institution (mostly banks) under very favourable conditions (including loan forgiveness), provided the loans were used for eligible payroll costs. According to the data released, more than five million loans were generated. This loan data is useful because it contains uncommonly detailed information about each loan, including the borrower’s address and the name of the originating bank.

I gathered a random sample of more than 200k loan records and pinned down the exact geolocation of each borrower (firm) using Google’s geolocation API. Then, using the geographic location of each bank branch in the U.S., I calculated the geodesic distance between each borrower and the nearest branch of the lending bank. The following map shows the location of each PPP borrower, colour-coded by distance to its corresponding lender.
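As a rough sketch of this step, the snippet below computes the borrower-to-nearest-branch distance with the geopy library. The file names and column names (borrower and branch coordinates, bank and lender names) are hypothetical placeholders, and the nearest-branch search is done by brute force rather than the spatial indexing one would use at scale.

```python
# Sketch: distance from each PPP borrower to the nearest branch of its lender.
# File and column names below are hypothetical placeholders.
import pandas as pd
from geopy.distance import geodesic

borrowers = pd.read_csv("ppp_sample.csv")    # one row per PPP loan (lat, lon, lender_name)
branches = pd.read_csv("bank_branches.csv")  # one row per US bank branch (lat, lon, bank_name)

def distance_to_nearest_branch(loan, branches):
    """Geodesic distance (km) from a borrower to the closest branch of its lending bank."""
    lender_branches = branches[branches["bank_name"] == loan["lender_name"]]
    dists = [
        geodesic((loan["lat"], loan["lon"]), (b["lat"], b["lon"])).km
        for _, b in lender_branches.iterrows()
    ]
    return min(dists) if dists else float("nan")

borrowers["distance_km"] = borrowers.apply(
    distance_to_nearest_branch, axis=1, branches=branches
)
```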

Roughly 90% of the sampled borrowers obtained their loans from banks within a 100 km radius (in blue), whereas approximately 5% borrowed from lenders located between 100 km and 1,000 km away (in red). This is consistent with the notion that physical distance matters for lending, especially for small business loans whose approval relies heavily on soft information. Surprisingly, however, the remaining 5% corresponds to borrowers who obtained PPP funds from lenders more than 1,000 km away. In fact, the vast majority of those loans correspond to borrower-lender distances of more than 3,600 km, roughly the distance between the U.S. East and West coasts.

The following histogram offers more insight into the distribution of the PPP borrower-lender distances. Because this distribution is heavily right-skewed, I plot the x-axis on a logarithmic scale. This means that moving a unit distance along the x-axis corresponds to multiplying the previous number (say 10 km) by ten. The histogram shows that most firms borrow from lenders located nearby. In fact, 75% of these borrowers obtained loans from banks less than 10 km away. However, it also shows that the distribution of distances has a fat right tail, which corresponds to firms whose lending bank’s nearest branch is located more than 3,000 km away. Of these long-distance loans, the majority were originated by Cross River Bank. This is a relatively small bank with a single branch in Fort Lee, New Jersey that was featured in Fortune for originating $5.6 billion in PPP loans – No. 12 in the list of PPP loan originators – despite its small size.
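For reference, a log-scaled histogram of this kind can be produced roughly as follows. The sketch reuses the borrowers frame and the hypothetical distance_km column from the snippet above; the number of bins is an arbitrary choice.

```python
# Sketch: histogram of borrower-lender distances on a logarithmic x-axis.
import numpy as np
import matplotlib.pyplot as plt

distances_km = borrowers["distance_km"].dropna()

# Log-spaced bins so each bar spans an equal multiplicative range (e.g. 1-10 km, 10-100 km).
bins = np.logspace(np.log10(max(distances_km.min(), 0.1)),
                   np.log10(distances_km.max()), 50)

plt.hist(distances_km, bins=bins)
plt.xscale("log")
plt.xlabel("Distance to lender's nearest branch (km)")
plt.ylabel("Number of PPP loans")
plt.show()
```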

It turns out that Cross River Bank managed to accomplish this impressive amount of loan origination by partnering with FinTechs to automate its loan application process and serve businesses online. So, instead of relying on a broad branch network – which is how small businesses have traditionally accessed credit – Cross River offered its loan services virtually, and quite successfully, given the staggering amount of PPP loans supplied.

So, the PPP data suggests that the so-called soft information increasingly can be, and is being, transmitted online. This has important implications for small firms’ access to financial services. By removing the reliance on proximity for the acquisition of information, online lending allows small businesses to have equal access to fairly priced loans, irrespective of their geographic location. Nevertheless, given the uniqueness of PPP loans (e.g. no credit risk), it is unclear to what extent soft information was collected for loan approvals. Still, this may be a glimpse of what the future of lending – and of the acquisition of soft information – may look like.

References

  • Stein, J. C. (2002). Information production and capital allocation: Decentralized versus hierarchical firms. The Journal of Finance, 57(5), 1891–1921.
  • Liberti, J. M., & Petersen, M. A. (2019). Information: Hard and soft. Review of Corporate Finance Studies, 8(1), 1–41.

Slowdown of COVID-19 in Australia

Since its first reported case on 25 January, Australia’s response to the COVID-19 pandemic has been unique. Rather than imposing strict lockdowns aimed at halting non-essential activities for a few weeks (which has been the prevalent approach around the world), the Federal Government has emphasised social-distancing rules viable for at least six months.

On 22 March, nearly two months after the first confirmed case, Prime Minister (PM) Scott Morrison announced Stage 1 of a set of social distancing measures aimed at reducing social interactions while minimising economic disruption. The PM stated:

“We will be living with this virus for at least six months, so social distancing measures to slow this virus down must be sustainable for at least that long to protect Australian lives, allow Australia to keep functioning and keep Australians in jobs.”

The initial set of measures included restrictions on outdoor gatherings of more than 500 people and indoor gatherings of more than 100 people, the closure of entertainment venues, and suggestions on how to practise social distancing.

These restrictions were tightened on 29 March following what has been, so far, the highest daily increase in confirmed cases, with 460 new cases reported on 28 March. These Stage 2 measures reduced indoor and outdoor gatherings to two persons only, and Australians were strongly advised to stay home except for essential activities, such as shopping for supplies, medical needs, and exercise. But, despite these stricter measures, the government maintained its commitment to policies that could be sustained for at least six months while balancing the health and economic effects of the pandemic.

Although these restrictions have resulted in thousands of lost jobs and entire industries in desperate need of help, so far they have been effective in controlling – and even reducing – the spread of the virus in the community. This figure shows the seven-day moving average growth rate in COVID-19 confirmed cases since Australia reported its 100th case. After peaking at nearly 25% in early March, this rate has dropped to its lowest level of 4.5% on 6 April, 28 days after the country reached 100 cases.

australia_coronavirus_geps
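For those interested, the growth-rate series in the figure can be reproduced along these lines. The input file and column names (a cumulative case count by date for Australia) are hypothetical.

```python
# Sketch: seven-day moving average of the daily growth rate in confirmed COVID-19 cases.
import pandas as pd

cases = (
    pd.read_csv("australia_cases.csv", parse_dates=["date"])  # columns: date, confirmed
      .set_index("date")
      .sort_index()
)

cases = cases[cases["confirmed"] >= 100]          # start the clock at the 100th case
daily_growth = cases["confirmed"].pct_change()    # day-over-day growth in cumulative cases
smoothed = daily_growth.rolling(window=7).mean()  # seven-day moving average

print(smoothed.tail())
```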

In comparison, the average growth rate among the more severely affected countries in Europe and the US was almost four times higher, at 16.1%, at the equivalent point after reaching 100 cases. Even China, one of the most successful countries in containing the spread of the virus – according to the official figures – was sitting at a 10% growth rate 28 days after reaching its 100th case.

Overall, and despite specific blunders – such as the infamous Ruby Princess case – State and Federal governments in Australia have been successful in “flattening the curve” and slowing the spread of COVID-19. Let’s hope this trend continues.


Risk-Neutral Densities

Assuming a complete and arbitrage-free market, a fundamental result of asset pricing theory is the existence of a unique probability function (measure) under which the price of any asset can be represented as the discounted expectation of the asset’s future payoffs. For instance, the price of a call option can be expressed as:

\displaystyle C(K, \tau, r) = e^{-r \tau} \int_K^{\infty} (S_T - K) f(S_T) dS_T

where K is the option’s strike price, \tau the remaining time until expiration, r the risk-free rate, and S_T is a random variable representing the underlying stock price at expiration. By definition, the payoff of a call option is \max\{S_T - K,0\}, and hence the integral is taken over the interval in which the payoff is positive (i.e. S_T > K).

The function f(S_T) is called the risk-neutral density of S_T and can be intuitively thought of as a standard probability density function that combines investors’ own risk preferences with their beliefs about the true distribution of S_T.

This risk-neutral density is not directly observable; however, a simple yet remarkable result in option pricing theory, known as the Breeden-Litzenberger formula, allows one to estimate f(S_T). This formula is the result of taking the second derivative of C(K, \tau, r) with respect to K. That is:

\displaystyle \frac{\partial^2 C(K, \tau, r)}{\partial K^2}= e^{-r\tau} f(K)

Or rearranging terms:

\displaystyle f(K) = e^{r \tau} \frac{\partial^2 C(K, \tau, r)}{\partial K^2}

Hence, for any given stock, provided that there is a “reasonable” number of call (or put) options with varying strike prices, it is possible to approximate the underlying risk-neutral density f(S_T) with a central finite difference:

\displaystyle f(K) \approx e^{r \tau} \frac{C(K + \Delta K, \tau, r) - 2C(K, \tau, r) + C(K - \Delta K, \tau, r)}{(\Delta K)^2}
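As a minimal sketch of this finite-difference step (not the full procedure used for the figure below), the function here maps a grid of call prices into density estimates. The strike grid is assumed to be evenly spaced, and the example inputs are made up.

```python
# Sketch: Breeden-Litzenberger density estimate from call prices on an even strike grid.
import numpy as np

def risk_neutral_density(strikes, call_prices, r, tau):
    """Second finite difference of the call price with respect to the strike."""
    dK = strikes[1] - strikes[0]
    # C(K + dK) - 2 C(K) + C(K - dK), defined only on interior strikes.
    second_diff = call_prices[2:] - 2 * call_prices[1:-1] + call_prices[:-2]
    density = np.exp(r * tau) * second_diff / dK**2
    return strikes[1:-1], density

# Hypothetical inputs: strikes every 5 index points, observed call prices aligned with them.
strikes = np.arange(600.0, 1400.0, 5.0)
# call_prices = ...  # market prices for each strike, same length as `strikes`
# grid, density = risk_neutral_density(strikes, call_prices, r=0.02, tau=30 / 365)
```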

A perfect candidate for using this risk-neutral density estimation method is the S&P 500 index. Options on this index are by far the most traded options in the US. These options come in a wide range of strike prices, which can be used to implement the risk-neutral density estimation procedure described above.

The next figure is the result of implementing this risk-neutral density estimation on the S&P 500 for the month of October 2008, a period of great turmoil for equity markets around the world (implementation code here). In addition to applying the Breeden-Litzenberger formula, the estimation also employs spline interpolation to generate a smooth risk-neutral density, and the Generalised Extreme Value (GEV) distribution to complete the density tails. Further details about this estimation procedure can be found here.

risk_neutral_density_geps

The result (in blue) is a left-skewed density which encapsulates investors’ beliefs about the true distribution of the S&P 500 index. As with any other probability density, the area under the curve between two points on the x-axis can be interpreted as the probability that S_T lies between those two points at expiration. In addition, the figure shows (in red) the probability density function of a lognormal distribution with the same mean and variance as the risk-neutral density. Lognormal distributions are extensively used in finance to describe how equity prices (such as the S&P 500 index) behave, and thus this one serves as a benchmark for comparison. For instance, the probability of the S&P 500 index experiencing a 40% drop in October 2008 was roughly 8 times higher under the risk-neutral density than under the traditionally assumed lognormal distribution.
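To make such a comparison concrete, tail probabilities of this kind could be computed from the estimated density along these lines; grid, density, and spot are hypothetical stand-ins for the estimation output and the index level at the time.

```python
# Sketch: P(S_T <= threshold) under the estimated density versus a moment-matched lognormal.
import numpy as np
from scipy.stats import lognorm

def tail_probability(grid, density, threshold):
    """Integrate the density up to `threshold` with the trapezoid rule."""
    mask = grid <= threshold
    return np.trapz(density[mask], grid[mask])

def matched_lognormal(grid, density):
    """Lognormal distribution with the same mean and variance as the estimated density."""
    mean = np.trapz(grid * density, grid)
    var = np.trapz((grid - mean) ** 2 * density, grid)
    sigma2 = np.log(1 + var / mean**2)
    mu = np.log(mean) - sigma2 / 2
    return lognorm(s=np.sqrt(sigma2), scale=np.exp(mu))

# p_rnd = tail_probability(grid, density, 0.6 * spot)       # 40% drop, risk-neutral density
# p_ln = matched_lognormal(grid, density).cdf(0.6 * spot)   # 40% drop, lognormal benchmark
```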

In summary, the Breeden-Litzenberger formula, complemented with a highly liquid options market, allows us to summarise investors’ beliefs about the probability of different future outcomes in a simple density plot. In a sense, this density plot is a visual representation of all the information available about a particular equity asset and its price uncertainty.

US Media Preferences

In the US, people on both sides of the political spectrum claim media outlets do not report political issues fairly. Of the three most-watched networks, CNN and MSNBC are considered liberal-biased, whereas for many people Fox News is the prime example of a conservative-biased news source.

This animation shows the most-watched news outlet by county between 2005 and 2017: blue for counties in which the combined TV ratings of CNN and MSNBC (the liberal-biased outlets) are higher than those of Fox News, and red for counties where Fox News is the dominant news source. The percentages next to the networks’ names represent the proportion of US counties for which those networks were the most-watched that year.

media_preferences_geps

Interestingly, the most-watched news network varies considerably both across counties and over time. In the period leading up to the 2008 presidential election, liberal outlets’ ratings were on the rise. The year President Obama won his first presidential election, liberal outlets were the most-watched in 45% of US counties – the highest point for liberal outlets in the period observed. During the entire Obama administration, however, liberal outlets took first place as most-watched in less than 26% of all counties. Could this be the effect of a conservative backlash to a president perceived as fully embodying liberal values?

The 2016 presidential election seems to be another turning point for media preferences – with many counties turning blue. By 2017, 37% of US counties had liberal outlets as the most-watched. A liberal backlash to the election of President Trump? Backlash or not, media preferences seem to correlate negatively with the political affiliation of the president in office.

US Political Polarisation

A few minutes browsing online and we’ll encounter hundreds of people voicing their strong views on a wide range of topics such as immigration, religion, and the never-ending “free market versus interventionist state” debate. This reality speaks to a continuing polarisation of world views, amplified nowadays by social media platforms such as Twitter and Facebook.

In the US, the last few decades have seen political polarisation on the rise, with the phenomenon becoming more prevalent in recent years. Several causes have been identified for this growing trend. These include growing racial and ethnic diversity, new digital media outlets, and the rise of identity politics, among others (see The Top 14 Causes of Political Polarization).

Academics have also weighed in on this issue, as early as 1984, with empirical measures aimed at quantifying this polarisation. For instance, the political scientists Keith Poole and Howard Rosenthal developed a statistical method called NOMINATE to determine the ideological position of US congress members based on their voting records. The first dimension of the NOMINATE score measures where every congress member fits on the economic liberal-conservative spectrum. In this sense, the score represents each congress member’s ideological view with respect to the fundamental role of government in the economy. The values of this ideology score range from -1 for the most liberal legislators to 1 for the most conservative ones.

polarisation-logo

The figure here builds on the methodology developed by Poole and Rosenthal and shows the distribution of their ideology score for all US congress members from the 81st Congress (1950) to the 115th (2018). As expected, Democrats have predominantly negative scores and Republicans positive ones. Also, by construction, values close to zero represent centrist legislators, that is, those who are more likely to “cross the aisle” and support bills sponsored by the opposite party. Given the lack of overlap between the two distributions in recent years, these centrist legislators are now virtually non-existent.

The recent absence of more moderate congress members can be interpreted as another symptom of widespread polarisation in US politics. A major problem of a polarised political landscape is that it lessens the probability of bipartisan solutions to the most pressing (and perhaps even existential) problems of today’s world, such as climate change.

Brownian Motion

As a mathematical model, the Brownian Motion is well known for being explicitly delineated by none other than Albert Einstein in one of his “Annus Mirabilis” papers of 1905. In it, the Brownian Motion played a critical role in ending the then heated debate over the existence of atoms. However, the French mathematician Louis Bachelier was, in fact, the first to model what is now called the Brownian Motion as part of his PhD dissertation. Unlike Einstein, Bachelier developed the model to value stock options, and also unlike Einstein’s, the importance of Bachelier’s contribution was not well appreciated at the time[1].

Nowadays, the Brownian Motion (also called the Wiener process) is a cornerstone of modern mathematical finance. It has been widely used to model the time trajectory of the price of financial instruments such as stocks, bonds, and options. It is also a key component of the Black-Scholes-Merton option pricing model, which won Myron Scholes and Robert C. Merton the Nobel Memorial Prize in Economic Sciences in 1997[2].

Until not too long ago, the Brownian Motion was an enigma to me. I appreciated its importance and even recognised how to apply it to some finance-related problems, but its characteristics, and more importantly its construction, remained puzzling and elusive. This is no doubt why I was surprised to learn that its construction, however complex, can become very intuitive once we start looking closely. One approach to building a Brownian Motion hinges on a special family of wave-like functions called wavelets. Adding a certain kind of randomness to a particular wavelet approximation produces the famous Brownian Motion. For more details have a look at Chapter 3 of Michael Steele’s “Stochastic Calculus and Financial Applications”.
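As a rough, self-contained illustration of this idea, the snippet below follows the Lévy-Ciesielski construction, in which Gaussian coefficients are attached to Schauder functions (the integrals of Haar wavelets) and summed. The number of levels and the time grid are arbitrary choices, and this is a sketch rather than the code behind the animation below.

```python
# Sketch: wavelet-based (Lévy-Ciesielski) construction of Brownian Motion on [0, 1].
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)

def schauder(t, n, k):
    """Tent function obtained by integrating the Haar wavelet at scale n and shift k."""
    left, mid, right = k / 2**n, (2 * k + 1) / 2**(n + 1), (k + 1) / 2**n
    height = 2 ** (-n / 2 - 1)
    up = (t - left) / (mid - left)      # rises from 0 to 1 on [left, mid]
    down = (right - t) / (right - mid)  # falls from 1 to 0 on [mid, right]
    return height * np.clip(np.minimum(up, down), 0.0, 1.0)

def brownian_path(t, levels=12):
    """Sum Gaussian coefficients times Schauder functions, plus the coarse term Z * t."""
    path = rng.standard_normal() * t
    for n in range(levels):
        for k in range(2**n):
            path = path + rng.standard_normal() * schauder(t, n, k)
    return path

t = np.linspace(0.0, 1.0, 1024)
for _ in range(3):                      # three "states of the world", one path each
    plt.plot(t, brownian_path(t))
plt.xlabel("t")
plt.ylabel("B(t)")
plt.show()
```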

In this animation, I’ve used this wavelet-based construction to simulate and visualise three different Brownian Motions on the time interval 0 to 1 (implementation code here). The way to think about these three instantiations of the Brownian is to picture nature choosing different states of the world (three in this case). For each state of the world, there is an entire realisation of the Brownian, represented by one of the three lines in the graph.

However beautiful, the simplicity of this construction runs the risk of understating the importance of a mathematical model that constitutes the bedrock of an entire branch of mathematics called stochastic calculus.

[1] Nevertheless, many consider Bachelier the Father of Modern Option Pricing Theory.
[2] Fischer Black was not eligible for the prize due to his death in August 1995.

The Decline of Physical Banking

New technological developments are reshaping the way financial services are delivered. From new peer-to-peer lending platforms to cryptocurrencies, the so-called FinTech industry seems to be gaining steam, and its full potential is still uncertain.

One of the financial sectors most influenced by new technological developments is the traditional commercial banking system. Many economies have certainly moved far away from the old-fashioned banking model where people would physically commute to their closest bank branch, greet their well-known bank teller, and spend long periods (queuing up, mostly) to complete mundane transactions such as opening a bank account, depositing money, or applying for a mortgage. Today, some of these transactions still require some sort of physical interaction with our local banks; however, most of them are now performed online or are ceasing to exist (when was the last time you made a bank deposit?).

Nevertheless, the extent of the ongoing transformation is still unclear. The EY FinTech Adoption Index provides important insights into how different markets are changing their consumption habits towards FinTech. I believe, however, that there is a much simpler way of appreciating this transformation. The next figure presents a less sophisticated (but arguably more direct) way of observing changes in the way financial services are consumed, at least as far as commercial banking trends go.

physical_bankingLOGO

This figure shows the average number of bank branches in US counties over time, along with the average amount of deposits per bank, also at the county level. Unlike the number of bank branches, bank deposits have increased steadily over the last 20 years. This speaks to the fact that, despite recent disruptions to the system (the GFC and FinTech competitors, for instance), the banking industry has experienced growth. Nevertheless, the average number of branches per county has decreased from a high of 49.5 (per 100,000 people) in 2008 to 45.5 today. These recent developments could very well attest to the reshaping of the way financial services are consumed and the current influence of FinTech.

20 Years of Financial Access in the US

Despite recent changes in the way financial services are provided, to this day commercial banks are still arguably the most important providers of saving and investment products. These financial services allow households to trade off consumption today against consumption tomorrow in a process referred to as consumption smoothing. This trade-off allows households to “maximise” their lifetime utility (as an economist would put it), which in simpler terms means looking for a proper balance between spending and saving during different stages of our lives so that consumption is more or less “stable”.

For instance, retirement accounts enable people to set aside money during their working life to then be consumed in retirement. Similarly, a personal loan allows us to increase consumption today at the expense of a potential reduction in consumption tomorrow. These are two examples of financial services offered by commercial banks, access to which (or lack thereof) can have a significant impact on households’ living standards and their ability to mitigate unexpected events (e.g. job loss).

Despite its importance, the extent to which households have access to financial services (also referred to as financial inclusion) is hard to measure. Most available indicators rely on survey data and unavoidably have to trade geographic granularity for detail, providing very rich financial access data but usually only at aggregate levels (see Global Findex).

financial_access_geps_optimized

This map attempts to provide a more granular measure of financial access, one that captures geographical differences in the access to financial services across US counties. Specifically, the map shows the number of bank branches per 100,000 people for each US county. Despite new technological developments, proximity to financial providers is still an important determinant of whether people can access basic financial services such as bank accounts and personal loans. Hence, having more branches in a specific region (after adjusting for its size) can serve as a proxy for its overall state of financial inclusion. Over the last 20 years, counties in the US have experienced important changes in this measure of financial inclusion. Relative to the late 1990s, most counties nowadays present higher levels; however, geographical differences persist and seem to be very consistent over time.
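For concreteness, the measure behind the map can be computed roughly as follows; the input files and column names (branch records carrying a county FIPS code, and county population estimates) are hypothetical.

```python
# Sketch: bank branches per 100,000 people by US county.
import pandas as pd

branches = pd.read_csv("us_bank_branches.csv")      # columns: county_fips, ...
population = pd.read_csv("county_population.csv")   # columns: county_fips, population

branch_counts = (
    branches.groupby("county_fips").size().rename("n_branches").reset_index()
)

access = (
    branch_counts.merge(population, on="county_fips", how="right")
                 .fillna({"n_branches": 0})
)
access["branches_per_100k"] = 100_000 * access["n_branches"] / access["population"]
```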

There are, of course, important limitations to this characterisation of financial access. One could argue these geographical differences are “demand-driven”, that is, caused by differences in the degree to which certain regions prefer to consume these services. Nevertheless, given the aforementioned importance of financial services in allowing people to smooth consumption and mitigate risks, this measure may still provide important information on where people are able to balance their spending more effectively.

US Banks and the 10 Billion Mark

The Global Financial Crisis of 2008 unleashed a series of regulatory changes around the world aimed at overcoming the market flaws that led to the collapse of several financial institutions and billions of dollars spent on bailout programs. As a direct response, in 2010 the US Congress passed the Dodd-Frank Wall Street Reform and Consumer Protection Act. Among other important changes, Dodd-Frank tightened the regulatory framework for commercial banks, especially for large institutions considered to be an important conduit of the financial meltdown that took place between 2007 and 2009.

Specifically, under Dodd-Frank, institutions with more than $10 billion in assets are subject to a stricter set of regulations, which includes caps on fee income and the obligation to conduct annual stress tests. These more demanding regulatory requirements impose extra costs, and perhaps this is exactly why the size distribution of commercial banks in the US has changed in recent years. The following figure shows the size distribution of commercial banks (roughly 200 of them) with assets between $9 billion and $11 billion before and after Dodd-Frank was enacted.

dodd_frank_mark

The fact that a larger proportion of banks is currently observed below the $10 billion mark (relative to before the change in regulation) may constitute an instance of regulatory arbitrage. To avoid a stricter regulatory environment and higher operating costs, banks in the US purposely remain below the $10 billion mark. To do so, these financial institutions may decide to slow down their credit production or transition towards less conventional banking models such as the so-called originate-to-distribute model. Hence, lower lending growth and/or higher market risk (through bank securitisation, for instance) could amount to unintended consequences of this key reform.

Earlier this year, the Trump administration successfully rolled back some of Dodd-Frank’s most important provisions, effectively increasing the threshold for banks to be subject to stricter federal oversight. It is yet to be seen how these new changes will reshape the US banking sector.


The Case for South American Football

Because of its extraordinary local leagues and great players, Europe is arguably the most competitive football region in the world. According to the FIFA ranking of men’s national teams, European countries take 7 spots in the top 10, with Germany currently in the lead and Brazil, Argentina, and Chile being the only non-European countries in this top 10. Also, of the 20 FIFA World Cups held so far (excluding Russia 2018), 11 have been won by European teams.

This perhaps justifies the fact that, since its creation, the FIFA World Cup has featured roughly 50% European teams. However, considering that the remaining 9 World Cups were won by South American countries, it can seem unfair that the number of South American teams has been kept to roughly 20% of the total number of national teams competing.

world_cup_mark

Well, perhaps to add to this argument, this graph shows how much more effective South American national teams have been in reaching the knockout stages of previous (and current) World Cups. The long-standing average percentage of teams from South America that passed the group stage sits at 66%, and it is 80% in the current 2018 World Cup, with only one team not qualifying for the next round and most South American teams leading their corresponding groups.

Whether South American football “deserves” more spots in the FIFA World Cup is hard to tell, but South America as a region has certainly been better than any other when it comes to overcoming the always challenging group stage. All in all, it may then be surprising that, out of the 12 new spots announced for the 2026 World Cup, only 1 will be allocated to the South American Football Confederation (CONMEBOL).