11/15/2013

The Aggregate Liquidity Management Model of Interbank Rate Targeting

Introduction

In a previous post, we explored how the central bank’s balance sheet evolves over the reserve maintenance period. In this post, we’ll review one of two models Bindseil presents for how the central bank controls the interbank rate. Here, we focus on the “aggregate liquidity management model.”

Bindseil notes that there is a rich and heterogeneous literature related to these ideas, but that most of it emphasizes the empirical properties of the interbank rate rather than modeling how the central bank controls it. So the literature modeled here is a bit "exclusive," if you will. The modeling approach that Bindseil focuses on has Poole (1968) at its origins.

The Aggregate Liquidity Management Model with Certainty

The “aggregate liquidity management model" focuses on the banking system as a whole, as opposed to taking an individual bank's perspective. The other model I'll review in a future post adopts the latter approach.

To begin, assume perfect interbank markets. Also assume the following type of reserve maintenance period, which is one day long and permits intraday overdrafts (which can be thought of as reserve averaging around 0):



Steering Interbank Rates under the Aggregate Liquidity Management Model

The aggregate liquidity management model gave us the following equation for what determines the interbank rate:

Equation A: it = iB*P + iD*(1 - P)

where it is the interbank rate, iB and iD are the borrowing and deposit standing facility rates expected to prevail at the end of the maintenance period, P is the probability of recourse to the borrowing facility, and (1 - P) is the probability of recourse to the deposit facility.
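As a quick sanity check on Equation A, here is a minimal Python sketch. The corridor rates and probabilities are made-up illustrative values, not figures from Bindseil:

```python
def interbank_rate(i_borrow, i_deposit, p_borrow):
    """Equation A: the interbank rate is a probability-weighted average
    of the two standing facility rates."""
    return i_borrow * p_borrow + i_deposit * (1 - p_borrow)

# A symmetric corridor: 3% borrowing facility, 1% deposit facility.
# With P = 0.5, the rate sits at the corridor midpoint (2%).
mid = interbank_rate(0.03, 0.01, 0.5)

# Draining liquidity pushes P toward 1, and the rate toward iB.
tight = interbank_rate(0.03, 0.01, 0.8)
print(mid, tight)
```

The point of the exercise: the central bank can move the interbank rate either by shifting the corridor rates themselves or by steering P through open market operations.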

I used Equation A to create the following table, which describes the central bank’s policy options for changing interest rates. This is not in Bindseil’s book, but it directly follows from Equation A, and I think it is a nice way to condense the equation’s implications. Each cell is the result of keeping constant, changing, or not controlling some combination of the variables in Equation A.



Appendix: Explanation of the Math behind the Aggregate Liquidity Management Model of the Interbank Rate

This appendix explains the mathematical foundations for the "aggregate liquidity management model" of the interbank rate. All that's required to understand this is a basic familiarity with probability and calculus.

The Individual Shock Model of Liquidity Management

In contrast to the aggregate liquidity model, another way to model interest rates is to take the individual bank’s perspective. This model may be more appropriate if reserve requirements are very low and/or if banks cannot average reserve requirements across the maintenance period. (Ultimately, a central bank may use both models as inputs to its decision-making.)

First, consider the idea that the demand for reserve balances depends on uncertainties relating to payment flows, which vary all the time. For instance, if a bank experiences a liquidity shock after the interbank market has closed and does not have enough reserves on hand, it will be forced to go into a costly overdraft with the central bank. Such uncertainty creates a demand for reserves, since reserves aren’t always acquirable in the interbank market or through cheap, daylight overdrafts.

To further develop this idea, let’s introduce the variable q, which is the amount of reserves each bank holds before the liquidity shock. Also recall the simplified form of Equation 6 discussed here: B - D = M - A. Assuming this amount of reserves is spread equally across all N banks, q = (M – A) / N. If we assume the liquidity shocks, represented by the variable x, are normally distributed (with φ representing the probability density function (pdf) of the standard normal distribution – see this Appendix for a description of pdfs), then the expected value of the liquidity shocks is zero across all banks. EDIT: Also assume no deposit facility.

Let’s now consider the expected cost of going into overdraft, C(q). (Bindseil uses C(q). I would have preferred something more of the form E[f(x)].) A bank will go into overdraft if it has negative reserves at the end of the day: x + q < 0. The overdraft amount in that case is -(q + x) > 0, so the cost of borrowing will be -iB*(q + x). If (q + x) > 0, then there is no need for the bank to borrow from the central bank, so there are no borrowing costs. In summary, the cost of borrowing, call it f(q), is:

f(q) = -iB*(q + x) , if x < -q
f(q) = 0 , if x ≥ -q

We can now calculate the expected cost, C(q), of going into overdraft, by weighting the borrowing cost by the pdf of the shocks over the overdraft region:

C(q) = ∫ from -∞ to -q of -iB*(q + x)*φ(x) dx

See the Appendix for an explanation of this equation if its form is not familiar to you.
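To make C(q) concrete, here is a small Python sketch that estimates the expected overdraft cost by simulation and compares it against the closed form implied by standard-normal shocks (the value of iB is an assumed placeholder, and the function names are mine, not Bindseil's):

```python
import random
from statistics import NormalDist

N01 = NormalDist()   # standard normal: N01.pdf is φ, N01.cdf is Φ
i_B = 0.03           # borrowing facility rate (assumed illustrative value)

def expected_overdraft_cost(q, draws=200_000, seed=0):
    """Monte Carlo estimate of C(q): the cost -i_B*(q+x) is incurred
    only when the shock x pushes the bank below zero (x < -q)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(draws):
        x = rng.gauss(0.0, 1.0)
        if x < -q:
            total += -i_B * (q + x)   # overdraft amount is -(q+x) > 0
    return total / draws

def expected_overdraft_cost_exact(q):
    """Closed form for standard-normal shocks: C(q) = i_B*(φ(q) - q*Φ(-q))."""
    return i_B * (N01.pdf(q) - q * N01.cdf(-q))

print(expected_overdraft_cost(0.5), expected_overdraft_cost_exact(0.5))
```

As expected, C(q) is positive and shrinks as the bank holds more reserves q, which is the trade-off the next paragraph builds on.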

From this, one can see that if the bank were to increase the amount of reserves q it holds, the probability of going into overdraft would decline, and with it the expected cost of going into overdraft. But holding more reserves q entails a cost as well, which is the interbank interest rate. Thus, a bank will only acquire more q if the cost of doing so, i, is less than or equal to the benefit of doing so, which is the corresponding decline in the expected cost of going into overdraft, C(q). Put another way, the interbank rate, which is the opportunity cost of holding reserves, should settle at a level where banks are indifferent between holding one more unit of reserves and not. This is econ 101 marginal analysis, where we determine the equilibrium market interest rate by setting marginal cost equal to marginal benefit. We can kick the complexity up a notch by representing the equilibrium market rate using partial derivatives:

Equation B: - ∂(C(q)) / ∂(q) = i

Since q = (M – A)/N, the central bank can clearly control i by controlling [iD, iB] and/or by altering M, which (in this simplified model) is in a simple linear relationship with q, to an appropriate level. The appropriate level will be determined by the shape of the probability density function perceived by the market, which the central bank needs to estimate. Of course, in the real world, the “market” may not transparently be using such a clean and simple model of probability. The model is a proxy for a more complicated real-world process, so central banking is certainly part art form as well. Nonetheless, the model is helpful: it helps the central bank estimate the appropriate amount of open market operations and grapple with the expectations of the market. All of that, combined with building a credible reputation, helps the central bank achieve its interest rate target with precision.
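Under the standard-normal assumption with no deposit facility, differentiating C(q) reduces Equation B to i = iB*Φ(-q), where Φ is the standard normal cdf: the interbank rate equals the borrowing facility rate times the probability of overdraft. This can be inverted to back out the reserve supply M consistent with a given rate target. The sketch below does exactly that; all the numerical values are made up for illustration:

```python
from statistics import NormalDist

N01 = NormalDist()

def reserves_for_target(i_target, i_B, A, N):
    """Invert i = i_B * Phi(-q) (Equation B under standard-normal shocks,
    no deposit facility) for the per-bank reserve level q, then back out
    the required reserve supply M from q = (M - A) / N."""
    q = -N01.inv_cdf(i_target / i_B)
    return A + N * q   # M

# Hypothetical numbers: iB = 3%, target rate 1.5%, autonomous factors
# A = 100, N = 10 banks. A target at half of iB implies Phi(-q) = 0.5,
# i.e. q = 0, so supply just offsets autonomous factors.
M = reserves_for_target(0.015, 0.03, A=100, N=10)
print(M)   # 100.0
```

Note the comparative statics: a lower rate target requires a larger M (looser liquidity), which is the mechanism the table in the earlier section condenses.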

For a more complete version of the individual shock model, see this post on Michael Woodford's model.

Appendix: Explanation of the Math behind the Individual Shock Model of the Interbank Rate

This appendix explains the mathematical foundations for the "individual shock model of liquidity management.” All that's required to understand this is a basic familiarity with probability and calculus.
Here's the basic mathematical representation of the model:

10/09/2013

Interest Rates vs. Monetary Aggregates: A (Sort of) Brief History of the Monetary Policy Implementation Debate

As I mentioned in a previous post, I will be doing a series of posts on issues surrounding monetary policy implementation. I already broached the concept of monetary policy implementation vs. strategy, an understanding of which I think is critical for delving deeper into this topic, including what’s contained within this post.

Below, I summarize Bindseil’s account of the history of the debate surrounding the optimal approach to monetary policy implementation. Many of the people and ideas contained within will be familiar to those interested in this topic, but the historical arc along which they appear may be less widely known. Bindseil relies on an extensive set of primary and secondary sources. However, these views are not necessarily unique to Bindseil. I’d imagine Post-Keynesians would give similar accounts, and likely have. I’d also emphasize at the outset that this is a 4 page summary of ~34 pages of Bindseil’s work. Bindseil obviously provides much more argumentation, citation, and quotation than I do here. Moreover, this topic is returned to throughout his ~250 page book.

9/26/2013

Ulrich Bindseil: If You’re Interested in Monetary Policy Operations, Get Your Hands on His Book


I plan to do a series of posts covering various aspects of monetary policy operations and theory, including the following topics: