Implied volatility calibration

Option prices can be modeled in many ways. One of the most standard approaches is to model the future evolution of the underlying price as a stochastic martingale process, apply the contractual payoff(s), and then take the average of total payoffs over the simulated scenarios. In some cases the answer can be derived analytically. However, the resulting pricing model, like the simplest Black-Scholes (BS) model, will be far from reality. The BS model assumes that the price evolution over an instant from t to t+dt follows a normal distribution with constant volatility. This assumption was tested many years ago (E. Derman and I. Kani, 1994) via a special procedure in which the algorithm searches for the volatility that reproduces the price observed in the market.

Such a procedure requires (for equities):

  • option contract specification (strike, maturity, currency, stock identifier),

  • price of the underlying (a reference number, usually fixed via third-party information, most often from a public exchange),

  • dividends of the underlying,

  • an interest rate curve derived from money market products (OIS swaps, IR futures, deposits, etc.). This curve is needed to discount future cash flows.

A synthetic forward of the underlying, constructed from call-put prices as a function of strike, can be used directly in the procedure. In this case there is no need for IR and dividend information.
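As a minimal sketch of the procedure, here is the Black-Scholes inversion in pure Python: a bisection search for the volatility that reproduces an observed European call price. All numbers are illustrative; in practice one would use market quotes and a faster root finder.

```python
import math

def bs_call(S, K, T, r, sigma):
    """Black-Scholes price of a European call option."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    N = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))  # standard normal CDF
    return S * N(d1) - K * math.exp(-r * T) * N(d2)

def implied_vol(price, S, K, T, r, lo=1e-4, hi=5.0, tol=1e-8):
    """Bisection search for the volatility reproducing the market price."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if bs_call(S, K, T, r, mid) > price:
            hi = mid
        else:
            lo = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

# Illustrative round trip: recover sigma = 0.25 from its own BS price.
target = bs_call(100.0, 105.0, 1.0, 0.02, 0.25)
iv = implied_vol(target, 100.0, 105.0, 1.0, 0.02)
```

The same loop, run over a grid of strikes and maturities, produces an implied volatility surface.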

Analysis of freight market

Movement of goods and commodities within a country and across borders is executed by logistics. It involves multiple parties: producers, sellers, shops, marketing (including e-commerce), logistics and trade brokers, vehicle owners, trade financing (banks), insurance, and contract and arbitration lawyers. So, in order to deliver a product from A to Z, one has to take into account many variables before the product can move from producer to consumer. Often this delivery is done by intermediaries, who either match demand with supply for a fee and/or try to make a profit by buying and selling goods on their own balance sheet. Intermediaries therefore have to know or predict where the market will move. This involves understanding the price dynamics not only of the goods but also of the logistics itself, because an intermediary who does not own the vehicle has to rent it from the vehicle's owner or manager.

All that involves analysis of:

  • production cost (EXW, per Incoterms)

  • alternative freight modes and routes (shipping, trucks, rail, airplanes, and warehouses, if storage is involved)

  • trade financing (if the intermediary wants to leverage his/her balance sheet)

  • insurance, including possibilities to hedge currency and freight/delivery risks

  • demand for goods versus their availability on the market

Therefore, it is important to be able to model these components in order to make reasonable predictions.
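As a toy illustration of putting these components together, here is a hedged sketch of a per-unit landed-cost calculation. The cost structure (EXW price plus freight, insurance, and financing over the transit period) follows the bullets above; all rates and amounts are hypothetical placeholders, not market data.

```python
def landed_cost(exw_price, freight, insurance_rate, financing_rate, days_in_transit):
    """Rough per-unit landed cost: production + freight + insurance + financing."""
    insurance = insurance_rate * (exw_price + freight)       # cover goods in transit
    financing = exw_price * financing_rate * days_in_transit / 365.0  # cost of tied-up capital
    return exw_price + freight + insurance + financing

# Hypothetical inputs for a single unit shipped over 30 days.
cost = landed_cost(exw_price=100.0, freight=12.0,
                   insurance_rate=0.005, financing_rate=0.06,
                   days_in_transit=30)
```

A real model would add warehousing, brokerage fees, and currency hedging costs per the list above.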

Pricing set-up, exotic derivative products (example: Equity)

Here is the list of steps to price such a product. It is written as a memo to myself and to help others.

  1. Define feed and historical data of:

    • Stocks/Indices

    • Options. Typically, American-style options are used for stocks, European-style options for indices.

    • Interest rate curves.

      • One needs OIS as a basis to define the funding cost. Some banks take the collateral rate as a measure. However, the most correct value would be some blend of all funding sources. Such a blend can follow the WACC (Weighted Average Cost of Capital) methodology or a more complicated method, e.g. weighting by volume. To be discussed further.

      • If the above is difficult to determine, one can use Libor (remember, it is to be replaced!). This will grossly overestimate the cost.

    • If the product is "quantoed" (payoff converted to another currency at maturity or similar), then you also need:

      • currency feed

      • interest rates (OIS, IBOR; see the discussion about funding cost above) per currency, and

      • in some cases a feed of cross-currency basis swap quotes/rates.

  2. Store the data in a database. The choice of DB depends on the purpose of the application. We recommend:

    • sqlite. This is a serverless DB, which means it does not support concurrent writing but can serve multiple reading processes. It is fast and compact. Sometimes people store data into multiple files, so concurrency is implemented outside the DB. It is often used in high-frequency trading, sometimes as an intermediate intraday data staging DB.

    • MySQL (some prefer Postgres, but that's not my favorite). MySQL has a free Community edition and the Workbench GUI. It can handle concurrency and can be used as the backbone of a server-based web site or (mobile) application.

    • HDF5 is a file format (not an SQL database) used to store really huge amounts of data (e.g. petabytes). Data are stored in linear tables within a hierarchical, file-like structure. Many people use it to store time series or to train AI/ML models; the latter is specifically useful because of fast reading of (any-size) chunks of data. Special design is required to make it really fast.

    • Hadoop. A collection of open-source distributed applications, including databases.
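To make the sqlite option above concrete, here is a minimal staging sketch using Python's built-in `sqlite3` module. The table layout and tickers are illustrative; a real feed handler would batch inserts and write to a file rather than memory.

```python
import sqlite3

# One writer, compact storage, fast reads: the intraday staging pattern.
conn = sqlite3.connect(":memory:")  # use a file path in practice
conn.execute("""CREATE TABLE IF NOT EXISTS quotes (
                    ts TEXT, ticker TEXT, price REAL,
                    PRIMARY KEY (ts, ticker))""")

# Illustrative tick data from the feed.
rows = [("2020-01-02T10:00:00", "ABC", 101.5),
        ("2020-01-02T10:00:01", "ABC", 101.7)]
conn.executemany("INSERT OR REPLACE INTO quotes VALUES (?, ?, ?)", rows)
conn.commit()

# Latest price for a ticker: the kind of read the pricing process issues.
last = conn.execute("SELECT price FROM quotes WHERE ticker=? "
                    "ORDER BY ts DESC LIMIT 1", ("ABC",)).fetchone()[0]
```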

  3. Calibrate volatilities:

    1. Risk-neutral measure (extracting information about future underlying dynamics from currently published prices or instant information):

      • Local vols are calibrated from American options. These are "instantaneous" volatilities. Remember that they can be used in Monte Carlo when simulating paths from "now" to the "future": during the simulation, when the path is at some time-price point, the corresponding local volatility is selected to generate the next point.

      • Implied vols come from European options (or can be approximated from American options). They are an "integral" of local volatilities from "now" to the maturity of the European option. They cannot be used directly in Monte Carlo; they first have to be converted into local volatility.

      • There are transformations between implied and local volatilities. For example, see Fengler, "Semiparametric Modelling of Implied Volatility", for details.

      • Implied volatility can be extracted from option prices in several ways:

        • inverting the Black-Scholes formula for European options,

        • using the Dupire formula; for this to work, one has to implement an interpolation of European option prices along the strike dimension,

        • using approximate pricing formulas for American options.

      • Also, it is possible to imply the p.d.f. of the underlying at maturity from option prices. See Breeden and Litzenberger, "Prices of State-Contingent Claims Implicit in Option Prices", for the idea.

    2. Real-world or physical measure (extracting information about the future from historical prices):

      • Volatilities can be extracted with econometric models (e.g. GARCH-type).

    • Notes: when using volatilities, pay attention to the period of time they refer to. For example, implied and local volatilities are typically quoted over a one-year horizon, while econometric models deliver volatilities relevant for the time interval of their input data. In the latter case, scaling is needed.

    • Interpolation of volatility helps to create a "surface" (for equities) or a "cube" (for interest rate products).
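The scaling note above in numbers: a minimal sketch converting a daily econometric volatility estimate to the annualized convention in which implied and local vols are quoted. The returns are made-up illustrative values; the conventional factor is the square root of the number of trading days per year (typically 252).

```python
import math

# Hypothetical daily log-returns (illustrative numbers, not market data).
returns = [0.010, -0.008, 0.004, -0.012, 0.006, 0.002, -0.005, 0.009]

# Sample standard deviation of daily returns: a volatility over a 1-day interval.
mean = sum(returns) / len(returns)
var = sum((r - mean) ** 2 for r in returns) / (len(returns) - 1)
vol_daily = math.sqrt(var)

# Implied/local vols are quoted per annum; scale the daily estimate by
# sqrt(trading days per year), conventionally 252.
vol_annual = vol_daily * math.sqrt(252)
```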

  4. Calibrate the stochastic model. It describes the future evolution of the underlying with a Stochastic Differential Equation (SDE). Typical choices:

    • Local volatility can be described with a local volatility model (Dupire-type) or calibrated to a stochastic volatility model like Heston

    • Implied volatility is described with an implied volatility model like SABR, or parameterized directly, e.g. with SVI (Stochastic Volatility Inspired)

    • It can also be based on the definition of a probability space, for example using a Markov chain approach or the Fokker-Planck equation.

    • An econometric model can be used as well. In some cases it can be a very convenient one.

    • Tip: there is no universally good model. Most of the time the analyst has to look into the data and select the best one.

  5. Pricing

    • Many simpler products have closed-form solutions. Before reinventing "the wheel", search for one.

    • The price of a derivative is equal to the expectation (average) of the payoff over the range of scenarios provided by the chosen stochastic model. It is usually calculated in one of two ways:

      • Monte Carlo with full path generation, where tricks such as variance reduction apply to speed up the solution. The Longstaff-Schwartz regression method also helps, in particular for early-exercise features.

      • Alternatively, the payoff is applied as a boundary condition on the path space generated by the SDE, and the corresponding PDE is solved with finite difference schemes.
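To illustrate the Monte Carlo route, here is a compact sketch pricing a European call under plain Black-Scholes (GBM) dynamics, with antithetic variates as the variance-reduction trick mentioned above. Parameters are illustrative.

```python
import math
import random

def mc_call_price(S0, K, T, r, sigma, n_paths=100_000, seed=42):
    """European call by Monte Carlo with antithetic variates under GBM."""
    rng = random.Random(seed)
    drift = (r - 0.5 * sigma**2) * T
    vol = sigma * math.sqrt(T)
    disc = math.exp(-r * T)
    total = 0.0
    for _ in range(n_paths // 2):
        z = rng.gauss(0.0, 1.0)
        for zz in (z, -z):  # antithetic pair: reuse each draw with opposite sign
            ST = S0 * math.exp(drift + vol * zz)
            total += max(ST - K, 0.0)
    return disc * total / (2 * (n_paths // 2))

price = mc_call_price(100.0, 105.0, 1.0, 0.02, 0.25)
```

For path-dependent payoffs one would generate full paths, looking up the local volatility at each time-price point as described in the calibration step.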

Validation of algorithmic trading

Given the interest in algorithmic trading validation coming from regulators, here is a (non-exhaustive) list of items to be checked when this activity is validated.

List of input requirements:

PRA - Consultation Paper CP5/18, "Algorithmic trading", February 2018

PRA - Supervisory Statement SS5/18, "Algorithmic trading"

MiFID II - Algorithmic trading - AFS summary

FCA : Algorithmic and High Frequency Trading (HFT) Requirements

Acronyms and terminology:

MTF - Multilateral Trading Facility

OTF - Organized Trading Facility

HFT - High Frequency Trading

DMA/DEA - Direct Market Access / Direct Electronic Access

PnL - Profit and Loss

Trader - owner of the trading system

Algorithm - model interpreting the market and the system's own performance

System - embeds the market and own-portfolio views, the algorithm, and the order execution module

Validation of algorithmic trading is an exercise to prove the integrity of the following items:

  • Stress resiliency of the algorithm (protection against singularities or internally self-generated order flow) with respect to different scenarios:

    • simulation of quote flooding

    • physical disconnection

      • develop sequence of actions

        • check risk reports

        • focus on biggest risks

          • close position or

          • hedge

        • turn to alternative connections, brokers

  • Compensating controls

    • Pre-trade control includes estimations of:

      • profit margin (alternatively distribution of)

      • fee

      • risk margin charged by exchange

    • Kill-switch (red button) is a combination of:

      • hardware based blocks (circuit breaker disconnects from the market)

      • software based blocks (control over single order gate, like order flow internalizer)

      • foreseen implementation of post-disaster scenarios, like

        • shut down all orders outflow

        • recall/cancel all orders outstanding

        • neutralize unnecessary exposure (to the best knowledge of the system)

      • develop and test the system against such post-disaster scenarios:

        • What if the whole market goes UP/DOWN,

        • All orders move away from the mid-price

        • etc.

  • Backtesting algorithm:

    • portfolio (PnL) performance

      • price prediction

      • order flow prediction vs realized LOB dynamics

      • order execution vs market reaction

    • system performance due to:

      • latency

      • memory capacity

      • internal responsiveness

      • algorithm performance during periods of stress

        • requires development of a time/flow-pressure simulator

        • the algorithm may work differently under sequential tick injection versus under time pressure, if/when speed has priority over a smart result

  • Feed integrity (completeness and reliability of available information)

    • how many messages are lost

    • how does it impact the decision flow

  • Risk controls:

    • realtime control of own view vs exchange view over

      • order mismatch

      • position mismatch

    • same for end-of-day settlements (alignment with Settlements)

    • impact of latency variability (mainly if not co-located, but even with co-location there may be a problem)

    • when all of the above is working, check market risk

  • Resilience of trading systems:

    • reliable connectivity

    • presence of counter-balancing mechanisms embedded into the system allowing for compensation of the damage - so-called feedback mechanisms

The validator has to check algorithm performance keeping in mind that the following must be avoided:

  • disorderly market - voluntary or accidental destabilizing impact is equally damaging for the reputation of the Trader

  • market abuse:

    • a huge inventory or a huge competitive technical advantage may destabilize the market to the profit of the owner of the algorithm. In case of such an event it would be difficult to prove innocence.

    • Intentional abuse of price formation versus behaviour of market players

  • business risk arises if the Trader is a member of an exchange with obligations to maintain the market and the system's performance is not perfect:

    • How must such a trader react to a failure of his/her system? Can he/she fulfill the obligations?

Ensure that the responsible Trader knows and understands the behavior of the system. All of the above items require reflection in policies.
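As an illustration of the pre-trade controls and kill-switch items above, here is a deliberately simplified sketch of an order gate. All thresholds, field names, and the control logic are hypothetical; a production gate would be far richer and latency-sensitive.

```python
def pre_trade_check(expected_margin, fee, exchange_risk_margin,
                    order_qty, max_order_qty, kill_switch_on):
    """Return True only if the order passes all compensating controls."""
    if kill_switch_on:                 # red button overrides everything
        return False
    if order_qty > max_order_qty:      # hard limit against fat-finger orders
        return False
    # order must be expected to cover the fee and the risk margin
    # charged by the exchange (the pre-trade estimations listed above)
    return expected_margin > fee + exchange_risk_margin

# Hypothetical order: margin estimate comfortably covers fee + risk margin.
ok = pre_trade_check(expected_margin=0.12, fee=0.03,
                     exchange_risk_margin=0.05,
                     order_qty=100, max_order_qty=10_000,
                     kill_switch_on=False)
```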

Consolidated Basel-4 framework (Part-1)

BIS has published its Consolidated B4 framework for banks.

Summary of themes:

  1. Introduce a stronger border between the Banking and Trading books.

  2. Better alignment between Credit Risk measures within the Trading and Banking books:

    1. Trading Credit Risk was measured under risk-neutral measures (i.e. implied from traded instruments such as CDS, bonds, etc.), while

    2. Banking Credit Risk was simpler but had a more complicated structure. It was a blend of:

      1. Ratings coming from major rating agencies

      2. Statistics from the sample of default events

      3. Fundamental information coming from the business of the counterparty (market size, accounts, etc.)

    3. The main difficulty of implementing such alignment lies in the search for an equivalent measure between the Risk-Neutral and Fundamental/Structural credit risks present in the Trading and Banking books.

  3. Trading book (Market and Trading Credit risks):

    1. It fixes the capital arbitrage problem, a main concern of regulators. In the past it was possible to shuffle instruments between the books in order to optimize (reduce) the capital requirement. FRTB (Market risk) sets two restrictions:

      1. on product definitions, where products are either "held to maturity" (banking book) or "available for trade" (trading book);

      2. it is not possible to change the capital model for items that changed books.

    2. Capital for the Trading book must be calculated with the Standardized Approach. This is done for a better and more homogeneous capital benchmark between banks.

    3. Traded Credit risk, formerly accounted for in IRC (Incremental Risk Charge), now moves into DRC (Default Risk Charge).

      1. IRC included both credit spread (a tradeable, diffusion-type series) and default events (jumps);

      2. DRC moves the capital for default events into the banking book.

    4. Cross-book (banking/trading) hedges are disallowed.

    5. The replacement of VaR with Expected Shortfall seems not to be much of a problem.

    6. NMRF (Non-Modellable Risk Factors) are very close to those mentioned as RNIV (Risk Not In VaR) by PRA (UK).

    7. The new regulation requires approval at desk level.

    8. Overall, the structure has changed:

      1. Basel-2: regulatory Market risk RWA was a blend of the IMA and Standardized approaches. Regulators encouraged banks to develop Economic Capital so that regulators could benchmark both numbers.

      2. Basel-3: Market risk RWA is ultimately calculated by the SA, while the IMA becomes the "new Economic Capital" and will be used for regulatory benchmarking.

  4. Cost of Credit - Expected Loss and CVA (Credit Valuation Adjustment)

    1. Within Basel-2, Expected Loss (EL) was provisioned within the annual budget. Any Unexpected Loss was covered from the capital buffer. Trading Credit Risk (also Counterparty Credit Risk) accounted for EL similarly.

    2. During and after the crisis of 2007-2009, CVA became important as a measure aligned with other instruments in the Trading Book. Problems related to hedging rules still had to be resolved.

Consolidated Basel-4 framework (part-2)

Previous part was about:

  • Border between Banking and Trading books,

  • Alignment problems (mostly data) and solutions for the Credit risk measures used in these books

  • Structural changes in Trading book capital measurement

  • Alignment between EL (Banking book) and CVA (Trading book)


  1. Changes in Banking book:

    1. The change in Credit Risk measurement is driven by IFRS 9 requirements about classification of portfolio items and measurement (Fair Valuation vs Amortization vs FVOCI (Fair Value through Other Comprehensive Income))

    2. A change from TTC (Through The Cycle) to PIT (Point In Time), which is closer to the risk-neutral measure used in the Trading book. It helps to reduce the delay in recognition of asset impairments.

    3. The (Current) Expected (Credit) Loss (CECL) model reacts more smoothly and faster to the state of the counterparty. Compare with IAS 39, where a loss is expected and accounted for only in pre-default and default states.

    4. With regard to what has been discussed in the previous part, the questions are:

      1. Can we compare CECL and CVA directly?

      2. Hence, can we compare PD(PIT) and PD(Risk-Neutral)?

      3. Also, can we compare LGDs? Trading book credit-related items are mostly governed by ISDA, while the Banking book is governed by local legislation and loan agreements with the lender (be it a mortgage or a plain customer loan).

    5. Remark: changes in the BigData industry have led to openness of SME financial data, better predictability of economic data, and increased research on predicting economic activity from satellite surveillance (by the way, an interesting topic for a separate discussion).

  2. Balance sheet structural effects, related to ALM:

    1. IRRBB - Net Interest Income (NII) risk

      1. The main driver of confusion and problems here is the existence of different methods of accounting for interest:

        1. accrual, as specified in the contract, vs

        2. aligned with IR instruments, like swaps.

      2. This difference in methods is the main reason for the gap.

      3. Consistent simulation of IR scenarios across the entire book can be a source of more optimal resource allocation.

    2. Liquidity

      1. The HQLA requirement creates demand for government bonds, which are required to be supported with deposits. The Liquidity Coverage Ratio (LCR) implicitly demands equalization of HQLA with run-off cash flows within 1 month, where deposits are the most vulnerable in this regard. The Net Stable Funding Ratio (NSFR) is closely related to this ratio.

    3. Structural FX risk

      1. This comes from the necessity to protect the capital adequacy ratio at the aggregated level (denominated in the base currency) from changes in the capital of branches (denominated in other currencies) due to movements in foreign exchange rates (see "Minimum capital requirements for market risk", January 2016, item 4, page 5).

  3. In summary, overall balance sheet optimization must be done within the following regulatory constraints:

    1. Capital adequacy ratio

    2. Leverage ratio

    3. LCR and NSFR

Stress testing

Some thoughts aloud:

Banks employ a rich set of scenarios to test the resilience of a portfolio and to deliver various measures of risk. These models span from:

  • Type-I: scenarios with attached probabilities (weights), as used in Value-at-Risk (VaR) or Monte Carlo (MC) types of models, to

  • Type-II: stress-test-based models, where possible market states are scanned over a wide range, portfolio performance is inter-/extrapolated, and the worst-case scenarios are used as the risk measure.

However, thinking more about stress tests, they should serve as a model-independent addition to the usual probability-measure based tools (Type-I).
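A minimal Type-I example: historical VaR computed as an empirical quantile of equally weighted PnL scenarios. The PnL values are illustrative.

```python
# Hypothetical one-day PnL scenarios (illustrative numbers).
pnl = [-5.2, 1.3, 0.8, -2.1, 3.4, -7.9, 0.2, 2.6, -1.4, 4.1,
       -3.3, 0.9, -0.6, 1.8, -4.7, 2.2, -0.1, 3.0, -2.8, 1.1]

def historical_var(pnl_scenarios, confidence=0.95):
    """VaR as the loss not exceeded with the given confidence.

    Each scenario carries equal probability weight (the Type-I setup).
    """
    losses = sorted(-p for p in pnl_scenarios)   # losses, ascending
    idx = int(confidence * len(losses)) - 1      # empirical quantile index
    return losses[idx]

var95 = historical_var(pnl)
```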

Another method is to reverse-engineer the risk factors of the portfolio in terms of its weakest points (no likelihood/probability attached), but that serves the purpose only to some extent, because the exercise depends on the specific in-house (pricing) model. Some model independence can be achieved by building a market-wide (AI?) model that detects pockets of instability (~singularities), whose scenarios are then injected into the pricing of the portfolio.

Discrepancy or consistency between the measures used in pricing and in risk modelling is a similar topic, because if a singularity becomes certain (realises), it changes/shifts all pricing (via the price of risk, e.g. optionality or #XVA-type modelling items).

IBOR Replacement

#IBOR #replacement by 2021-dec-31:

  • US: Secured Overnight Financing Rate (SOFR)

  • UK: Sterling Overnight Index Average (SONIA)

  • EU: Euro Short-Term Rate (€STR, earlier referred to as ESTER)

  • Switzerland: Swiss Average Rate Overnight (SARON)

  • Japan: Tokyo Overnight Average Rate (TONAR or TONA)

The main care should be taken with products maturing after 2021-dec-31.

"Younger" products will mature naturally.

ISDA prepared a report: Anonymized Narrative Summary of Responses to the ISDA Consultation on Term Fixings and Spread Adjustment Methodologies.

Contractual fixing will be indexed on an average of rates recorded during the month before the roll-over takes place. This is a non-linear transformation and involves an Asian feature with fixing in arrears.
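The in-arrears feature can be sketched as daily compounding of overnight fixings over the accrual period, so the coupon is only known at the end. The fixings below are illustrative, not published rates, and the act/360 convention is an assumption.

```python
# Hypothetical daily overnight fixings (annualized, act/360 style).
daily_rates = [0.0180, 0.0182, 0.0179, 0.0185, 0.0181]

def compounded_in_arrears(rates, day_count=360):
    """Compound daily overnight rates; the result is known only in arrears."""
    growth = 1.0
    for r in rates:
        growth *= 1.0 + r / day_count      # one day of accrual per fixing
    n_days = len(rates)
    return (growth - 1.0) * day_count / n_days   # annualized period rate

rate = compounded_in_arrears(daily_rates)
```

For small rates the compounded result is close to the arithmetic average, but the difference matters for long periods and in valuation.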

Due to directions from the FED, ECB, BoE, etc., there is hype around #LiborReplacement, which is due in 2021. Some argue that it will flow "by itself", similar to the change from national currencies to the EURO in 1999-2002; some think it will be a "major disaster".

Indeed, many banking systems rely on Libor. It starts with pricing and risk models, which may use Libor as a core rate. Although this approach has changed since ~2007, when the OIS rate was introduced as the central rate for modelling, there might still be entities who use Libor as the central quote.

Libor is often used as a reference rate, e.g. Libor+X%, to price commercial products in corporate and retail (mortgage) markets. After 2021, all these contracts must be rolled. The change from Libor to the new rate will likely be done under a zero-profit condition. That has to be calculated. Those who will do the calculations: remember to account for the "in-arrears" settlement condition during the transition period.

To add a few words about the causes of the #Libor problem and possible solutions:

  • Libor was quoted by a few closely related banks. It was tempting for them to manipulate the rate, so they did.

  • One solution to avoid manipulation is to invite a #thirdparty who will honestly and independently monitor the market and quote it. However, there are famous negative examples, when rating agencies were part of the deal too.

  • Governments take the role into their own hands and say that publicly traded short rates will be the new Libor. This is regarded as not the most optimal solution and might turn out to be a new handle for corruption or market manipulation, as when they tried to save those who were "too big to fail".

  • Another solution can be in the hands of #CCPs. By definition, many banks today are obliged to process a large portion of vanilla IR contracts through a CCP. Hence, these CCPs are able to calculate the all-balancing rate from the inherent cash flows. For the sake of market stability it would be useful to publish aggregated distributions of cash flows within the financial system. By the way, a CCP is also able to calculate implied contractual rates from these flows, hence it is possible to construct a more reliable "new Libor" rate. That would embed a useful informational feedback.

  • Yet another possibility is to build a distributed (blockchain) register for quotes delivered by all participants. An open-source algorithm would calculate and publish the rate based on the information delivered by participants. A read-only backdoor can be given to regulators for audit purposes. The design of such a system can be elaborated further to ensure stability and avoid manipulation.
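Whatever the aggregation scheme, the published statistic should be manipulation-resistant; Libor itself used a trimmed mean, dropping the top and bottom quartiles of panel submissions. A sketch with illustrative quotes:

```python
def trimmed_mean_fixing(quotes):
    """Drop the top and bottom quartile of submissions, average the rest."""
    q = sorted(quotes)
    k = len(q) // 4                      # quartile size to trim on each side
    kept = q[k:len(q) - k] if k > 0 else q
    return sum(kept) / len(kept)

# Hypothetical panel submissions; the outlier barely moves the fixing.
quotes = [0.0210, 0.0212, 0.0208, 0.0211, 0.0209, 0.0213, 0.0207, 0.0350]
fixing = trimmed_mean_fixing(quotes)
```

The trimming is what makes a single rogue submission (like the 3.5% quote above) largely ineffective.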

Extraterrestrial artificial sources of particles: the case of neutrino physics

The capabilities of INNOVAEST are in banking, trading, general business (logistics, warehousing, production, etc.), physics, and potentially more. We have never tried medical data, but why not...

Physics: in February 2020, I submitted an article describing the idea of placing artificial particle sources outside the Earth. The best starting example should be (and will be) the Moon: it faces the Earth with one side, has low temperatures, and is surrounded by deep vacuum.

Three days ago the TheNextWeb site described this problem, referring to my article. It calls on NASA and Elon Musk, and mentions D. Trump somehow. A lunar accelerator is another nice goal for humanity to achieve. Perhaps I would love to see it sooner rather than later.

PS. A few days later, my friends sent me these two articles: