Velocity of money is central to the quantity theory of money, which relates it to the general price level. While the theory has motivated countless empirical studies to include velocity as a price determinant, few find a significant relationship in the short or medium run. Since the velocity of money is generally unobservable, these studies were limited to using proxy variables, leaving it unclear whether the lacking relationship refutes the theory or the proxies. Cryptocurrencies on public blockchains, however, visibly record all transactions, and thus allow us to measure—rather than approximate—velocity. This paper evaluates the most commonly suggested proxies for velocity and also proposes a novel measurement approach. We introduce velocity measures for UTXO-based cryptocurrencies focused on the subset of the money supply effectively in use for the processing of transactions. Our approach thus explicitly addresses the hybrid use of cryptocurrencies as media of exchange and as stores of value, a major distinction in recently proposed theoretical pricing models. We show that each of the velocity estimators is approximated best by the simple ratio of on-chain transaction volume to total coin supply. Moreover, “coin days destroyed”, if used as an approximation for velocity, shows considerable discrepancy from the other approaches.
Velocity of money has played a key role in traditional monetary economics since it was popularized by  over a century ago. Broadly speaking, velocity of money denotes the average number of transactions per monetary unit within a certain time period. 1 In the quantity theory of money, velocity is related to the price level. While empirical studies frequently apply this concept to cryptocurrencies, surprisingly few find a significant relationship between velocity and prices. We take this discrepancy as an occasion to evaluate current approaches to quantify the velocity of money for cryptocurrencies, and to propose a novel one.
Until recently, meaningful measures for the velocity of cryptocurrencies did not exist, and most studies resorted to proxy variables. 2 Recent years saw the first advances to measure—instead of approximate—the velocity of money. In  and , the quantity equation of money was first used to measure velocity as the ratio of transaction volume to money supply. In , however, this approach is modified to create a measure that handles the change transactions in cryptocurrency systems. While  and later  focused on adjusting the transaction volume in the above ratio, we complement their approach by adjusting the money supply.
Money in effective circulation should be differentiated from money held for long-term investment or speculation. Not only does the total monetary aggregate contain technically dysfunctional money (burnt coins), a major portion of cryptocurrency is stored unused over long time periods (compare  or ). Economists like ,  or  have argued for excluding such funds and focusing on money in circulation. To our knowledge,  and  were the first to apply this distinction in theoretical cryptocurrency pricing models. Both link feedback effects from speculation and price levels to a reduction of coins in effective circulation. In , velocity of money is explicitly defined based on the component of the coin supply in effective circulation. In this paper, we operationalize this definition for velocity measurement.
In implementing this concept, we make common implicit assumptions explicit. For example, the separation of money into hoarded or circulating depends on the choice of a time window. Tokens can be defined as circulating if moved within the last day, month, year or any other period. The choice of  and , defining money in circulation as the total coin supply, implies an infinite time window. The other extreme might be a very restrictive definition requiring coins to be moved within the period for which velocity is measured. As the optimal time-window might depend on the respective use case, we operationalize a velocity measure for UTXO-based 3 cryptocurrencies as a function of the respective time-window.
Subsequently, we apply our approach to Bitcoin and compare a variety of potential proxy variables to measures characterizing the two extremes of the design space. Measuring the goodness of fit from a variety of perspectives in a series of Model Confidence Set (MCS) tests, we show that the most common proxy variable, coin days destroyed (CDD),4 exhibits higher approximation errors in the vast majority of tests than the simple ratio of unadjusted on-chain transaction volume to total coin supply. As the majority of research opted for CDD, our results might suggest a reason for the unexpectedly missing relation between velocity and prices in most studies.
Our implementation is based on the open-source blockchain parser BlockSci 5. The codebase to calculate the evaluated velocity measures for UTXO-based cryptocurrencies is reusable and will be openly available after publication. In summary, we offer three contributions to research on cryptocurrencies:
a review of approaches to quantify velocity, 6
novel measures based on money in circulation, and
an evaluation of common approximation methods.
In the early 20th century,  spawned an extensive literature in monetary economics based on what he called the equation of exchange, in which velocity played a crucial role. Since this literature does not account for the information blockchains make available, and is reviewed at length by , we restrict our review to the literature on the velocity of cryptocurrencies. Both theoretical pricing models and empirical studies of price determinants have addressed velocity.
Empirically, cdd is commonly used as a proxy variable in regressions of cryptocurrency return patterns. Based on the quantity equation, these studies expect a significant positive relationship between prices and their chosen proxy. While  and  confirm the hypothesis, more often it is rejected . Following ,  estimate velocity as the ratio of adjusted on-chain transaction volume to the total Bitcoin supply when modeling the bitcoin price. Additionally, they employ the ratio of off-chain transaction volume to coin supply (both denominated in USD) as a velocity estimator.
Theoretically,  use velocity as a central building block of their pricing model. They decompose the velocity of money into a part for monetary units used as media of exchange and a part for those used as long-term investment. The paper does not specify, however, how this decomposition could be implemented. Our paper is the first to offer an operationalization for UTXO-based cryptocurrencies.
In , a measure of the velocity of Bitcoin is presented that adjusts for the transaction volume generated by change transactions.  adopt the same concept but provide deeper insights into its technical configuration as a byproduct of introducing a new blockchain parser for UTXO-based cryptocurrencies.
Recognizing the need for a more precise method,  proposed a cryptocurrency’s turnover as a derivative of cdd. We compare this approach to the other methods and show that, compared to cdd, the measure is indeed closer to the velocity estimates in many tests.
At its core, velocity of money refers to the average number of turnovers per monetary unit within a period of time. This definition stems from the transaction form of the quantity theory of money as formalized by .7 The central equation of the theory equates the flows of real transactions, given by the scalar product of prices and transaction volumes , to total money flows, equal to the product of the money supply and its velocity where denotes the time period considered. With this amounts to
The scalar product on the right-hand side is referred to as the price sum. In this product, denotes a vector of prices of transacted goods and services in transaction during period . Transaction volumes are given in units of goods and services. They are conceptualized as the vector with volume in transaction in period . On the left-hand side, stands for the number of all units of money supply available in period . denotes the velocity of money.8 While , and are measured over a time period, is a point-in-time measure. To simplify, we assume the money supply is fixed during period and record it at the period’s beginning .
To develop intuition for velocity , it can be viewed as the weighted average number of turnovers of all monetary units . The weights are derived from sorting the units into groups with respect to their number of turnovers during period . Velocity then is
with monetary units in group in period . Velocity thus is the sum of turnover numbers , weighted by their respective fractions.9 While this definition is intuitive, it cannot be used to measure velocity in practice. Turnover numbers per monetary unit are neither recorded for fiat currencies, nor can they be inferred unambiguously for UTXO-based cryptocurrencies (compare ). In practice, the velocity of money is thus backed out of : .
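The weighted-average definition can be illustrated with a toy example (the group sizes below are purely hypothetical):

```python
# Toy illustration of velocity as a weighted average of turnover numbers:
# group monetary units by how often they turned over in the period, then
# average the turnover counts weighted by group size.

def velocity_from_turnover_groups(units_per_turnover: dict) -> float:
    """units_per_turnover maps a turnover count k to the number of
    monetary units that turned over exactly k times in the period."""
    money_supply = sum(units_per_turnover.values())
    total_turnovers = sum(k * n for k, n in units_per_turnover.items())
    return total_turnovers / money_supply

# 60 units never moved, 30 moved once, 10 moved four times:
v = velocity_from_turnover_groups({0: 60, 1: 30, 4: 10})
print(v)  # (0*60 + 1*30 + 4*10) / 100 = 0.7
```

As the surrounding text notes, such turnover groups are not directly observable in practice; the sketch only serves to make the weighting explicit.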
For cryptocurrencies, this appears a simple task, using on-chain transaction volume and total coin supply. However, due to their technical implementation, transaction volumes recorded on-chain are distorted. The next section therefore discusses the relevant subtleties of UTXO-based cryptocurrency systems.
Bitcoin builds its transaction graph by chaining transaction outputs. Many altcoins follow this approach and are referred to as UTXO-based cryptocurrencies. UTXO refers to the “coins” that can be spent—unspent transaction outputs. 10 Thus, UTXO-based cryptocurrencies record only transactions, in contrast to account-based cryptocurrencies which store a balance for each address on their blockchain. As an extensive exposition is provided by , we restrict ourselves to features relevant for calculating velocity measures.
UTXO-based cryptocurrency protocols ensure an ordered transaction history using a linked chain of so-called blocks (compare Figure 1). These blocks contain a hash (BlockHash) fingerprinting the information of all transactions recorded in the block, a timestamp (BlockTime) and the hash of the previous block (PrevBlockHash). This constellation of hashes and timestamps establishes pointers that determine the order of blocks. The process of creating new blocks, called mining, creates new monetary units in so-called coinbase transactions. Creating blocks usually involves solving a computationally demanding puzzle (proof-of-work) or proving stake in the existing coin supply (proof-of-stake).
Generally, transactions contain a hash as identifier (TxHash) as well as inputs and outputs. Coinbase transactions are exceptions as they include an output but no input. This effectively increases the amount of spendable outputs, and thus money in the system. Inputs are recorded as links back to outputs of a previous transaction identified by an index (PrevTxHash). Outputs can be sent to addresses (Address) corresponding to public keys of an appropriately generated public–private key pair. Unspent previous outputs can be used as inputs only upon proving ownership (Signature). Generating this proof requires the private key belonging to the public key which received the output. Importantly, transaction inputs can only be spent as a whole. If a fraction of the input is to be retained, an additional output must be included which links to an address belonging to the spender. Such addresses are known as change addresses and we refer to the respective outputs as change outputs.
We refer to all outputs sent to an address controlled by the sender, following , as self-churn. Since the concept of user identities does not exist in UTXO-based cryptocurrencies, there is no direct way to clearly separate self-churn from outputs transferred to third parties . Moreover, no relation between elements of the set of inputs and those in the set of outputs of a transaction is determined. In other words, no technical link between individual outputs and inputs exists; unspent outputs are fully fungible. Therefore, velocity measures cannot be calculated by following “coins” or UTXOs through their transactions: due to splitting and rejoining within blocks there exists no “path” the money took.
As discussed in the last section, by construction many cryptocurrency transactions contain outputs sending a fraction of the value back to the sender. Such change outputs as well as other self-churn ought to be excluded from transaction volume: “What is desired is the rate at which money is used for purchasing goods, not for making change .”
While the transaction volume can in principle be calculated by accumulating the output values 11 of all transactions recorded within period , this yields an inflated aggregate when defined as
with the set of all outputs of transaction in period and the set of all transactions recorded within period . Thus, this volume needs to be adjusted. Defining as the set of all self-churn outputs, the accumulated transaction volume from these outputs can be obtained by summing the individual self-churn outputs as
Hence a corrected transaction volume can be calculated as .
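The volume adjustment can be sketched as follows, assuming a simplified transaction format in which each transaction is a list of (value, is_self_churn) output tuples; the self-churn flag itself would come from clustering heuristics:

```python
# Sketch of the volume adjustment: subtract the aggregate self-churn
# output value from the raw on-chain output volume. The transaction
# format (list of (value, is_self_churn) tuples) is a simplifying
# assumption for illustration.

def deflated_volume(transactions):
    raw = sum(value for tx in transactions for value, _ in tx)
    self_churn = sum(value for tx in transactions
                     for value, churn in tx if churn)
    return raw - self_churn

# One payment of 3 coins with 7 coins of change going back to the sender:
txs = [[(3.0, False), (7.0, True)]]
print(deflated_volume(txs))  # 3.0, not the raw on-chain volume of 10.0
```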
Note that in practice we only observe the above transaction volume in terms of monetary units rather than the full vector of prices and transacted units.
As discussed in , only addresses but no identities are recorded in the transaction ledger. This complicates the classification of self-churn, which is needed to calculate the adjusted (deflated) transaction volume. However, statistical properties have been used to classify outputs as likely belonging to the same individual user as the transaction’s inputs (compare ). Although heuristics, such procedures are now commonly employed to create user clusters of addresses. Outputs are classified as self-churn if the cluster of their destination address equals the cluster of their input addresses.
As our empirical analysis builds on a blockchain parser proposed by , we follow their choice of heuristics. They employ one heuristic first proposed by  and one accounting for peeling chains. Peeling chains are transaction patterns where large unspent transaction outputs are split into smaller amounts in a chain of transactions. Upon manual inspection  concluded that outputs created and spent within a relatively short time period often belong to the same user cluster. The heuristics used are thus:
All inputs in a transaction presumably stem from one person. 12
Outputs created and spent within 4 blocks are classified as self-churn transactions.
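The first heuristic can be sketched with a union-find structure that merges all input addresses of a transaction into one cluster. This is a deliberate simplification; production parsers such as BlockSci handle many edge cases (e.g. CoinJoin transactions) that are ignored here:

```python
# Union-find sketch of the multi-input heuristic: all input addresses of
# a transaction are merged into one user cluster. Edge cases handled by
# real blockchain parsers are intentionally omitted.

class AddressClusters:
    def __init__(self):
        self.parent = {}

    def find(self, addr):
        self.parent.setdefault(addr, addr)
        while self.parent[addr] != addr:
            # Path compression keeps lookups fast.
            self.parent[addr] = self.parent[self.parent[addr]]
            addr = self.parent[addr]
        return addr

    def merge(self, addresses):
        addresses = list(addresses)
        root = self.find(addresses[0])
        for a in addresses[1:]:
            self.parent[self.find(a)] = root

clusters = AddressClusters()
clusters.merge(["A", "B"])  # tx 1 spends inputs from addresses A and B
clusters.merge(["B", "C"])  # tx 2 spends inputs from addresses B and C
print(clusters.find("A") == clusters.find("C"))  # True: one user cluster
```

An output would then be flagged as self-churn whenever its destination address resolves to the same cluster root as the transaction's input addresses.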
The quantity equation [eq:fisher] requires a measure of the money supply . To our knowledge, all prior work has employed the sum total of ever-mined coins as its measure. We denote this money-supply measure and calculate it at the beginning of each period . 13 In all common UTXO-based cryptocurrencies it is a deterministic function of block height. Technically, can be calculated as the aggregate of outputs from the set of coinbase transactions belonging to the set of all periods with a maximum block time smaller, and thus
Based on the quantity equation [eq:fisher], the simplest measure of velocity arises from dividing total transaction volume by total coin supply , which was described in  and adopted by . Formally,
offers the advantages of providing a theoretically sound interpretation and extremely simple calculation. Moreover, data for calculating (raw on-chain transaction volume and total coin supply) are widely available. 14 However, the result is biased: Self-churn transactions lead to an overestimation of transaction volume.
 and  propose a similar velocity measure. However, they clean the price sum from self-churn and get
In line with , both measures can be interpreted as the turnover of coins during period , averaged over the total coin supply.
While these measures advanced quantifying the velocity of cryptocurrencies, they suffer from inaccuracy as money supply is defined as “the aggregate of all monetary units ever issued” (). We therefore propose a measure based on the component of money that circulates effectively. Denoting this circulating amount , our measure yields
To see how excluding hoarded money relates to the quantity theory, expand the sum in and differentiate the set of all monetary units treated as investment, , from its complement :
By definition, encompasses only non-transferred units, and thus in period . Consequently
Hence, the measure can be interpreted as the average number of turnovers of units effectively circulating in period . Dropping non-circulating money in amounts to an adjustment of the money supply in the quantity equation [eq:fisher].
An advantage of basing velocity on money in effective circulation is higher information content. To begin with, it is questionable whether and capture more information than transaction volumes , respectively . After all, for most UTXO-based cryptocurrencies money supply is just a simple function of block height.15 Thus, the two former measures appear very close to merely scaled versions of their price-sums.
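The claim that for most UTXO-based cryptocurrencies the money supply is a simple function of block height can be illustrated for Bitcoin, whose block subsidy starts at 50 BTC and halves every 210,000 blocks (this sketch ignores provably burnt or unclaimed subsidies):

```python
# Bitcoin's total coin supply as a deterministic function of block
# height: the subsidy starts at 50 BTC and halves every 210,000 blocks.
# Computed in integer satoshis to avoid floating-point drift.

SATOSHI_PER_BTC = 100_000_000
HALVING_INTERVAL = 210_000

def total_supply_btc(height: int) -> float:
    subsidy = 50 * SATOSHI_PER_BTC
    supply = 0
    remaining = height
    while remaining > 0 and subsidy > 0:
        blocks = min(remaining, HALVING_INTERVAL)
        supply += blocks * subsidy
        remaining -= blocks
        subsidy //= 2  # halving
    return supply / SATOSHI_PER_BTC

print(total_supply_btc(210_000))  # 10500000.0 coins at the first halving
```

Because the supply series carries essentially no information beyond block height, a velocity measure dividing by it is close to a scaled version of its numerator, as argued above.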
Moreover, the total coin supply used in and includes money that is technically dysfunctional (burnt coins). Yet coins held unused for storage of wealth16 or speculation17 also fail to fulfill one of the key functions of money: use as a medium of exchange.
Furthermore, the amount of money frozen in speculative investments might not be neutral to money flows or prices. Since the beginnings of monetary economics, currency speculation has been associated with patterns in price levels. In  and , the illiquid component is considered a reservoir for neutralizing demand shocks and excluded from the money supply. For  and , hoarded money, like destroyed money, is leakage which must be compensated to stabilize the price level.  associates the rise in market prices for one of the early fiat U.S. bank notes with a relation between speculation and circulating money as well: “speculation acted as a regulator of the quantity of money.”
Our proposed measure captures precisely this velocity of money in effective circulation. This perspective is already present in current theoretical research on cryptocurrency prices. For instance, the model of  distinguishes demand for transactions from demand for rational speculation. When using the quantity equation, they deduct coins bought and held as speculative investment. Their modified quantity equation relates the exchange rate between fiat and cryptocurrency to the velocity of cryptocurrency in effective circulation and the volume of transactions denominated in units of cryptocurrency as
The speculators in the model purchase cryptocurrency units if traded prices lie below their risk-adjusted, discounted expected future price. With increasing aggregate speculative positions, the risk of marginal speculative investments rises, lowering the price speculators are willing to pay. On the other hand, higher speculative positions imply a reduced circulating supply and thus an increased current price according to their modified quantity equation. A similar relation between money in circulation and speculation is modeled in  as well.
Hence, implicitly or explicitly theoretical research already employs velocity based on circulating money rather than the total money supply. We close the gap in empirical research by operationalizing the circulation-based velocity measure for UTXO-based cryptocurrencies.
Based on the concept of velocity of money in effective circulation as clarified in the last section, we now propose algorithms to calculate velocity for UTXO-based cryptocurrencies.
Conceptually, a monetary unit has circulated in a period if and only if it has been used as a medium of exchange. Therefore, first a time window must be specified with respect to which monetary units are to be classified as circulating or not. Money is referred to as circulating if it has been moved economically within the last day, month, year or generally any time period covering .
To identify the fraction of the money supply which circulated within , we step through every transaction recorded in period . Transactions spending outputs generated before , as well as outputs from coinbase transactions (seignorage), have the interpretation of bringing an amount into circulation that corresponds to the value of the spent outputs. All inputs referring to UTXOs generated within period , on the other hand, re-spend money which has already been counted as circulating. 18
Note that we define the time window within which spent coins are considered as circulating distinct from the period for which the velocity measure is calculated, . This distinction can be parametrized via a maximum length of the look-back window , where and .
The approach as characterized so far, however, does not account for two technical properties of UTXO-based cryptocurrencies: First, transactions always spend prior transaction outputs in full. Second, there exists no attribution of individual outputs to the inputs of a transaction; thus it remains undefined which input corresponds to which output(s).
The first property raises the question how to deal with transactions which send back change: Should the sum of all inputs be considered in circulation, as technically all was transferred—or should only the fraction sent to third parties be considered? We analyze both choices, naming the first wba and the second mca. They are visualized in . The moved-coin approach considers only output of transaction as circulating, not the change output. This approach captures the net economic value transferred to a third party. The wba classifies the whole input of transaction as circulating. This approach captures the amount of money that has been moved technically; it can also be interpreted as revealed to be available for transactions.
In the mca, the sum of inputs is counted as circulating only net of change outputs. However, due to the second property of UTXO blockchains ambiguous constellations can occur: If for a given transaction one input was generated within and one before , it remains unspecified which one corresponds to the change output.
Transaction in Figure 2 illustrates the point. It has two inputs: originated before period and originated within . If only amounts sent to third parties matter, it remains unclear which of the two inputs funded the change output. For , the transaction would not increase money in circulation. In contrast, if funded the change output, the transaction would increase the amount in circulation by .
To resolve the ambiguity, an assignment rule between transaction inputs and outputs is required. We consider both endpoints on the spectrum of the age of the input assigned to the change output and thus differentiate between lifo, where the oldest inputs are assigned to outputs first, and fifo, where it is the other way around.
Naturally, with the wba this differentiation is void. Hence, we have three definitions of money in circulation, each a function of the activity window length : Money in circulation for period adopting the wba (), and both the mca with the lifo rule () and the fifo rule ().
Based on the above definitions of money in circulation, three variations of velocity can be calculated in accordance with Equation 8—one per money aggregate. All measures capture the average number of peer-to-peer coin turnovers of effectively circulating monetary units in period . They concur in capturing on-chain liquidity; they differ w.r.t. the definitions of circulating monetary units and the assignment rules linking transaction inputs to outputs. The first measure is based on and simply calculated as
is recommended if a conservative measurement of coin turnover is sought or additional assumptions about how to link inputs to outputs should be avoided. The second and third measures are based on and , respectively:
and are more stringent on the definition of money in circulation. Their monetary aggregates do not count “touched” funds, but only amounts transferred to somebody other than the sender. However, this comes at the cost of the additional assumption with respect to the assignment rule between transaction inputs and outputs.
Having defined three velocity measures, we now detail our technical approach and the implementation.
Money in circulation under the wba is measured as in Algorithm 1. For every period , we loop over all transactions and add their inputs to circulating money if they either reference outputs from coinbase transactions, denoted by , or outputs with timestamps before the first timestamp of period .
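A condensed sketch of this logic follows; the representation of transactions as dictionaries with (value, creation timestamp, coinbase flag) input tuples is a simplifying assumption, not the paper's actual data model:

```python
# Sketch of money in circulation under the whole-bill approach (wba):
# sum the values of all inputs that reference either coinbase outputs or
# outputs created before the start of the look-back window.

def m_circ_wba(transactions, window_start):
    circulating = 0.0
    for tx in transactions:
        for value, created_at, is_coinbase in tx["inputs"]:
            if is_coinbase or created_at < window_start:
                circulating += value
    return circulating

txs = [{"inputs": [(5.0, 90, False),     # created before the window: counts
                   (2.0, 105, False)]}]  # created inside: already counted
print(m_circ_wba(txs, window_start=100))  # 5.0
```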
Measuring money in circulation under the mca is depicted in Algorithm 2. As in Algorithm 1, for time window we loop over all transactions and add inputs based on the same core condition (compare lines [algo:code_mcirc_mc]-[algo:code_mcirc_mcCond] and [algo:code_mcirc_wb]-[algo:code_mcirc_wbCond]). This time, however, only those inputs are counted that add up to the amount sent to third parties. Therefore, the calculated amount of money in circulation per transaction can be less than or equal to the amount sent to third parties, but never more. The order in which inputs are considered is determined by the lifo or fifo principle. For every transaction in time window , the amount sent to third parties is determined net of self-churn as where denotes the non-self-churn outputs of transaction . If all outputs are identified as self-churn, and the algorithm continues with the next transaction. If , the algorithm collects input values in a vector ; they are sorted in either ascending (lifo) or descending (fifo) order w.r.t. the timestamp when the UTXOs were generated. Then, looping over inputs , input values are added to if they meet the core condition (compare line 17) introduced in line 6 of Algorithm 1. 19 However, one additional condition applies: If the last added input would increase the summand beyond the value of outputs sent to third parties , we only add up to the latter amount.
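A condensed sketch of the moved-coin approach follows, using the same simplified transaction representation as in the wba sketch above. Following the ordering convention described in the text, the lifo rule sorts inputs ascending by creation time and fifo descending:

```python
# Sketch of money in circulation under the moved-coin approach (mca):
# per transaction, only input value assigned (in lifo or fifo order) to
# the amount sent to third parties can count as newly circulating, and
# only if the input itself is a coinbase output or predates the window.

def m_circ_mca(transactions, window_start, rule="lifo"):
    circulating = 0.0
    for tx in transactions:
        # Amount sent to third parties, net of self-churn outputs.
        sent = sum(v for v, churn in tx["outputs"] if not churn)
        if sent == 0:
            continue
        inputs = sorted(tx["inputs"], key=lambda i: i[1],
                        reverse=(rule == "fifo"))
        budget = sent
        for value, created_at, is_coinbase in inputs:
            if budget <= 0:
                break
            spend = min(value, budget)  # cap at amount sent to others
            budget -= spend
            if is_coinbase or created_at < window_start:
                circulating += spend
    return circulating

tx = {"inputs": [(6.0, 90, False), (6.0, 105, False)],
      "outputs": [(8.0, False), (4.0, True)]}  # 8 to third party, 4 change
print(m_circ_mca([tx], 100, "lifo"))  # 6.0: the old input funds the payment
print(m_circ_mca([tx], 100, "fifo"))  # 2.0: the new input absorbs most of it
```

The example makes the ambiguity concrete: the same transaction yields different circulation amounts depending on which input is assumed to fund the change output.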
One important concern with any measure of economic activity is how amenable it is to manipulation. In the case of velocity, the question is specifically whether the measure can be inflated by a single agent (or a small group) of limited means. After all, one proxy variable for velocity has been designed as a manipulation-proof alternative to turnover (see Section 9.1.1). How easy would it be to create fake velocity in order to inflate our measures?
Indeed, no direct technical impossibility prevents generating transactions affecting Equations (Equation 12)–(Equation 15). Nonetheless, there exist reasonably tight limits to the manipulation potential of our measures, in particular compared to trading volumes on exchanges.
First and foremost, calculating our measure on on-chain transactions puts an upper limit on fake transfers. A fake transaction needs to be committed to the blockchain, and thus can be repeated only once per block time. As long as the manipulator is not the miner confirming the next block (a random and unlikely outcome even for large pools), the on-chain settlement is also costly, incurring fees.
Second, the manipulator must fully fund her fake transactions: she cannot send more than she owns once per block. The question then turns to how she should structure the fake transfers. Send the entire amount in a single transaction, or split it up into as many as can fit into a block? In the latter case, the fees rise. In the former, should she minimize or maximize the number of outputs? If she maximizes, the fees rise again, and she quickly fragments her wealth across a diverging multitude of wallets, reducing the funds available in each for the next round (block) of manipulation.
In this context it is critical that we follow the literature in clustering user addresses: As long as the clustering works, the manipulator cannot ever recombine her funds, lest she be exposed as a single agent and all her fake transactions disregarded. It follows that her strategy minimizes the number of outputs, generating a chain of fresh addresses across which a large sum, tied up in the manipulation, traverses indefinitely toward make-believe “newcomers.” This, however, matches peeling transactions, which we also exclude. She would thus need to pause frequently enough to escape this classification, slowing her manipulation further.
In sum, manipulation attempts are both technically elaborate and relatively straightforward to detect and exclude in the future. We thus do not consider manipulation a serious concern for our current results, nor for our measures.
Equipped with our proposed velocity measures that are calculated on a level of detail of each output in each transaction in each block on the blockchain, we can now use these measures to empirically assess the quality of popular proxies for velocity.
To this end, we first review the most common proxies in , then implement the proposed estimators to calculate velocity measures for Bitcoin in , and finally run tests in to evaluate the goodness of fit of the proxies w.r.t. the measures.
Since turnover can be gamed via repeated transfers of an agent to herself,20 the measure was introduced as a manipulation-proof alternative on bitcointalk.org in 2011.21 For each input, coin days refer to the product of its monetary value and the number of days “since it was last spent,” i.e. how many days the funding output had remained a UTXO. then is defined as the sum of coin days over all transactions within a period :
with denoting the number of days since the respective input originated as output (or coinbase transaction) in a prior block, and the value of input of transaction in period .
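The definition can be sketched directly, assuming each input is given as a (value, age in days) pair:

```python
# Coin days destroyed (cdd): for each input, multiply its value by the
# number of days since the funding output was created, then sum over all
# transactions in the period. The input format is a simplifying
# assumption for illustration.

def coin_days_destroyed(transactions):
    return sum(value * age_days
               for tx in transactions
               for value, age_days in tx["inputs"])

# Reactivating 2 coins dormant for 100 days destroys far more coin days
# than moving 10 coins that rested for a single day:
txs = [{"inputs": [(2.0, 100.0)]}, {"inputs": [(10.0, 1.0)]}]
print(coin_days_destroyed(txs))  # 200.0 + 10.0 = 210.0
```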
The measure puts larger weight on the reactivation of long-dormant coins compared to frequently spent ones, a feature that clearly differs from the concept of velocity.
In , a proxy for velocity aiming at “the average number of times the actively used [coins] can be expected to turn over” was proposed. While named “turnover” in , this is not to be confused with the classical turnover that refers to the simple sum total of transacted amounts within a period :
The measure in  in contrast is constructed as the inverse of average dormancy multiplied by the time period, where dormancy amounts to cdd scaled by the sum of inputs spent during the period :
Here refers to the length of the period for which turnover is calculated. To illustrate the concept, assume the coins spent today stayed unused for 6 hours on average before their transaction. Hence, circulating coins are turned over 4 times per day on average. Dormancy-based turnover, however, remains an approximation that depends on transactions being distributed homogeneously over time.
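The dormancy-based construction can be sketched as follows (units of days for ages and period length are assumptions of this sketch):

```python
# Dormancy-based turnover: dormancy = cdd / total value of inputs spent
# during the period; turnover is the period length divided by average
# dormancy. Input format: (value, age_days) pairs, a simplification.

def dormancy_turnover(inputs, period_days=1.0):
    cdd = sum(value * age for value, age in inputs)
    total_value = sum(value for value, _ in inputs)
    dormancy = cdd / total_value     # average days at rest per coin
    return period_days / dormancy    # expected turnovers per period

# Coins spent today rested 6 hours (0.25 days) on average:
print(dormancy_turnover([(4.0, 0.25), (4.0, 0.25)]))  # 4.0 turnovers/day
```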
We collect a dataset spanning from June 2013 until June 2019, starting with the rise of the first cryptocurrency exchanges and thus reliable trading data.
For the proxy variables, we tap existing sources as far as possible. CDD data is gathered via API from Blockwatch,22 trading data taken from CoinMarketCap.23
To calculate the velocity measures, however, access to the atomic units of cryptocurrency transactions is needed. We rely on the open-source blockchain parser BlockSci introduced by . We also provide the code of our paper at https://github.com/wiberlin/ccurr_velocity/. In replicating the clustering approach of  (see ), we exclude one unreasonably large cluster with over 297 million addresses in order to mitigate the number and effect of false positives.24
We calculate our velocity measures based on a time window for money in active circulation equal to period . 25 Therefore, a coin is in circulation if it is transferred at least once within the day for which velocity is computed. This daily measure can be interpreted as average turnover of monetary units which are part of the daily circulating money supply. While our choice for the window within which moved coins are considered actively in circulation is short, this allows us to flesh out the difference between velocity based on the total money supply and our proposed measures based on circulating supply most clearly. Moreover, since  provide results with an implicitly26 infinite , we provide evidence on the opposite endpoint on the spectrum. Naturally, neither the BlockSci parser nor our approach are restricted to these choices and can be extended both to other time windows and other UTXO-based cryptocurrencies.
Table 1 shows descriptive statistics for the proxies and the measures.  denotes cdd in million coin days, while  denotes active turnover in expected on-chain coin transfers. For both proxy variables, the mean exceeds the median, suggesting outliers in a skewed distribution. According to , a monetary unit of the total coin supply is turned over  times on average, while the more sophisticated measure results in a turnover of . The difference stems from the deflated transaction volume used by  being strictly lower than the inflated one (see ). According to the measure , which is based on the wba, coins in effective circulation during the day reach a turnover of around . Assuming the clustering heuristics work well, coin transfers correspond to peer-to-peer hops. Accordingly,  estimates that monetary units in circulation change owners  times per day, while  and  give an estimate of around  peer-to-peer hops. The reason for the higher turnover estimate is the more conservative operationalization of the concept of being in circulation (see Section 8.3).
The different levels can be disentangled by looking at the components of the velocity measures. Figure 3a illustrates this for the components of  and . While  increases steadily over time, the subset of coins transacted at least once per day, averaging 1.5 % of the total supply, is minuscule but volatile in comparison.27 The deflated on-chain transaction volume varies widely: clearly always below the total supply, yet above the supply in circulation. While the volatility in the transaction volume feeds fully into , the relation is less obvious for . In Figure 3b, the components' co-variation with the bitcoin price indicates that not only the deflated on-chain transaction volume , but also the monetary aggregates are positively correlated with price changes. Somewhat surprisingly, however, the velocity measures show a slightly negative correlation with prices.
Comparing maxima and minima across time series requires scaling. We use two methods: normalization and standardization. Normalization applies the usual min-max transformation to the unit interval; standardization uses Z-scores, subtracting the mean and dividing by the standard deviation. Both are sensitive to outliers, so we truncate the series at  standard deviations around the mean.28 Figure 4 shows the time series of the proxy variables for velocity. The scaling leads to a visible difference that is relevant when comparing proxies to measurement results. A first indication of the quality of the proxy variables is their diversity: not only spikes but also general trends vary across methods. Figure 5 depicts the different velocity measures. Here, differences are smaller, and highs and lows correspond more closely. The next section provides quantitative evidence.
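The two scalings and the outlier truncation can be sketched as follows; the truncation width `k` is an illustrative parameter, not the value used in the paper:

```python
from statistics import mean, stdev

def normalize(xs):
    """Min-max normalization to the unit interval [0, 1]."""
    lo, hi = min(xs), max(xs)
    return [(x - lo) / (hi - lo) for x in xs]

def standardize(xs):
    """Z-scores: subtract the mean, divide by the standard deviation."""
    m, s = mean(xs), stdev(xs)
    return [(x - m) / s for x in xs]

def truncate(xs, k=3.0):
    """Clip observations to k standard deviations around the mean
    (the paper's outlier treatment; k = 3 is an illustrative choice)."""
    m, s = mean(xs), stdev(xs)
    lo, hi = m - k * s, m + k * s
    return [min(max(x, lo), hi) for x in xs]
```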
We now turn to evaluating the proxy variables for velocity against our estimated measures. In doing so, we include the trivial velocity measure in the set of proxies, as it is calculated from very high-level inputs (raw on-chain transaction volume and total coin supply) that are as easy to obtain as cdd or turnover.
To evaluate the quality of proxy variables perfectly, the “true” velocity ought to be known. At the same time, we have proposed three novel measures in addition to those suggested by  and . We do not view this as a contradiction: after all, the measures capture different concepts of velocity; ours, for example, address the velocity of money in circulation. Therefore, we do not horserace all proxies against one benchmark, but rather evaluate the goodness of fit of the proxy variables with each of the measures. We do, however, take the position that the measures are more precise estimators of “true velocity” (in its respective concepts) than the proxies, and thus evaluate the latter in terms of their fit to the former.
Perhaps surprisingly, the results do not vary qualitatively much across different measurement approaches.
A common approach assesses approximation errors by comparing mean squared errors (MSE) or mean absolute errors (MAE). The MSE, by squaring deviations, punishes large ones more rigorously than the linear MAE. We provide both, applied not only to the standardized and normalized time series but also to their first differences. This is motivated, as in the econometric studies of Section 2, by doubts about the stationarity of the time series.
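A minimal sketch of the two error metrics and the differencing step (all names illustrative):

```python
def mse(truth, approx):
    """Mean squared error: punishes large deviations quadratically."""
    return sum((t - a) ** 2 for t, a in zip(truth, approx)) / len(truth)

def mae(truth, approx):
    """Mean absolute error: weights all deviations linearly."""
    return sum(abs(t - a) for t, a in zip(truth, approx)) / len(truth)

def first_diff(xs):
    """First differences, used when stationarity of levels is in doubt."""
    return [b - a for a, b in zip(xs, xs[1:])]
```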
Table 2 shows that the above transformations differ in their assessment of the approximation methods' goodness of fit. Judging by the normalized but undifferenced dataset, turnover (as alternatively defined by ) achieves the lowest error in approximating velocity measures based on the effectively circulating money supply. For all other constellations, however, we find that the trivial measure provides the closest fit to all velocity measures.
To understand whether the approximation methods indeed differ significantly, we perform Model Confidence Set (MCS) tests.  introduced these tests to compare the performance of forecasting methods, but they are applicable more generally. The method uses equivalence tests and an elimination procedure to determine the subset of models that significantly outperforms the remaining ones. For an intuitive understanding of the test, consider a set $M$ of models (here, the different approximation methods), indexed by $i = 1, \dots, m$. In each period $t$, comparing model $i$ to the benchmark yields a loss $L_{i,t} = L(v_t, \hat{v}_{i,t})$, where $v_t$ denotes the benchmark velocity measure and $\hat{v}_{i,t}$ the value of approximation method $i$.

To compare the performance of the models in $M$, relative performance metrics are defined as

$$d_{ij,t} = L_{i,t} - L_{j,t}$$

for all $i, j \in M$ with $i \neq j$. We use absolute and squared errors for the loss function $L$, so that it takes the forms

$$L_{i,t} = \left| v_t - \hat{v}_{i,t} \right| \qquad \text{and} \qquad L_{i,t} = \left( v_t - \hat{v}_{i,t} \right)^2 .$$

The relative performance of model $i$ compared to all other models then is

$$d_{i\cdot,t} = \frac{1}{m-1} \sum_{j \in M,\, j \neq i} d_{ij,t}$$

with $\mu_i = \mathrm{E}\left[ d_{i\cdot} \right]$. The null hypothesis states

$$H_0\colon \; \mu_i = 0 \quad \text{for all } i \in M .$$

If it can be rejected for the set $M$, some models in the set significantly outperform the remaining ones. The equivalence test is then re-iterated after elimination of the worst-performing model(s) until the null hypothesis can no longer be rejected. For details, see .
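The loss differentials underlying the test can be sketched as follows; this covers only the relative-performance computation, not the bootstrap equivalence test and elimination procedure of the full MCS method:

```python
def losses(benchmark, approx, squared=False):
    """Per-period losses L_{i,t} for one approximation method,
    using absolute or squared errors."""
    if squared:
        return [(v - a) ** 2 for v, a in zip(benchmark, approx)]
    return [abs(v - a) for v, a in zip(benchmark, approx)]

def relative_performance(benchmark, methods, squared=False):
    """Average loss differentials: d_{i.,t} for each method i,
    averaged over t. `methods` maps a method name to its series."""
    L = {i: losses(benchmark, xs, squared) for i, xs in methods.items()}
    m, T = len(methods), len(benchmark)
    dbar = {}
    for i in methods:
        diffs = [
            sum(L[i][t] - L[j][t] for j in methods if j != i) / (m - 1)
            for t in range(T)
        ]
        dbar[i] = sum(diffs) / T   # sample analogue of mu_i
    return dbar
```

A method whose average differential is negative loses less than its competitors on average; the MCS procedure would then test whether that advantage is statistically significant.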
Table 2 displays the results of the MCS tests, with significance at the 1 % level marked. Results for the differenced dataset are clear: even at the tight 1 % significance level, the MCS tests select a single winning approximation method for each constellation. This evidence confirms the trivial measure at high significance levels: it is the single best element of the 1 %-MCS for all constellations. For the normalized undifferenced dataset, turnover is the single best proxy for velocity based on circulating money in all but one of the constellations. If velocity is measured as , however, the trivial measure again performs significantly better than all other proxies. In summary, the MCS tests mostly support the trivial measure. Only where time trends are present and the focus lies less on outliers than on smaller changes might turnover be the better choice. Again, cdd, the most common proxy, is significantly outperformed in all constellations.
The Mincer-Zarnowitz (MZ) approach regresses the estimates on the benchmark in simple ordinary-least-squares regressions . In order to avoid spurious regression results, we only use the first differences of the standardized and normalized series. The regression structure for proxy $i$ can be expressed as:

$$\Delta \hat{v}_{i,t} = \alpha_i + \beta_i \, \Delta v_t + \epsilon_{i,t} .$$
An ideal proxy would yield an intercept of 0 and a slope coefficient equal to 1, with an adjusted R² of 1. The results of the MZ regressions in Table 3 confirm the simple ratio of inflated on-chain transaction volume to total coin supply as superior. This simple ratio yields an adjusted R² between  and ; the other proxies' values are much lower for all measurement methods of velocity. While for none of the approximations a significant intercept indicates bias, the slope coefficients are significant and positive for most constellations. Furthermore, with respect to the slope coefficients, the trivial measure performs best, showing coefficients between  and  for normalized and between  and  for standardized data.
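The MZ regression on first differences reduces to closed-form simple OLS. A self-contained sketch (function names illustrative):

```python
from statistics import mean

def ols(y, x):
    """Simple OLS of y on x: returns (intercept, slope, r_squared)."""
    mx, my = mean(x), mean(y)
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    beta = sxy / sxx
    alpha = my - beta * mx
    resid = [yi - (alpha + beta * xi) for xi, yi in zip(x, y)]
    ss_res = sum(e ** 2 for e in resid)
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return alpha, beta, 1 - ss_res / ss_tot

def mincer_zarnowitz(proxy, measure):
    """Regress first differences of the proxy on those of the measure;
    an ideal proxy gives intercept 0, slope 1, R-squared 1."""
    dy = [b - a for a, b in zip(proxy, proxy[1:])]
    dx = [b - a for a, b in zip(measure, measure[1:])]
    return ols(dy, dx)
```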
The MZ regressions are in line with the evidence of the prior sections: again, the trivial measure approximates the more sophisticated velocity measures better than the two commonly used proxy variables.
We analyzed approaches to quantifying the velocity of money for cryptocurrencies and introduced novel measurement methods based on money in effective circulation.
Our implementation shows that velocity as a function of the period within which money is considered effectively circulating can be more informative than prior velocity measures. Our results also raise questions for future research: Is velocity over certain time spans more closely related to price? If so, can such a relationship be exploited in the construction of stablecoins? What is the effect of including off-chain transactions?
In addition, we analyzed the goodness of fit of common velocity approximations. In most tests, we find that the common proxy variable coin days destroyed delivers higher approximation errors than the simple ratio of unadjusted on-chain transaction volume to total coin supply.
On a broader scale, by publishing our code we hope to foster research on the economic properties of cryptocurrencies.