PROGRAMME
ABSTRACTS
Thursday, 5 January 2017 

Time             Activity
08:30 – 08:50    Registration
08:50 – 09:00    Opening Address
09:00 – 09:45    Eckhard PLATEN
09:45 – 10:15    Group Photo & Tea Break
10:15 – 11:00    Ying CHEN
11:00 – 11:25    Ser-Huang POON
11:25 – 11:50    Seyoung PARK
11:50 – 12:15    Xiangwei WAN
12:15 – 13:30    Lunch
13:30 – 14:15    Stefan WEBER
14:15 – 15:00    Xun LI
15:00 – 15:25    Wei JIANG
15:25 – 15:55    Tea Break
15:55 – 16:20    Ankush AGARWAL
16:20 – 16:45    Chen YANG
16:45 – 17:10    Ariel NEUFELD
17:10 – 17:35    Anulekha DHARA
17:35 – 18:00    Patrick GE
18:00 – 18:25    Junbeom LEE
Friday, 6 January 2017 

Time             Activity
08:30 – 09:00    Registration
09:00 – 09:45    Robert ANDERSON
09:45 – 10:15    Tea Break
10:15 – 11:00    Jussi KEPPO
11:00 – 11:25    Cong QIN
11:25 – 11:50    Budhi Arta SURYA
11:50 – 12:15    Hui SHAO
12:15 – 13:30    Lunch
13:30 – 14:15    Rama CONT
14:15 – 15:00    Alexandre POPIER
15:00 – 15:45    Ioannis KYRIAKOU
15:45 – 15:50    Closing Address
Robert ANDERSON
Principal Component Analysis (PCA) relies on the assumption that the data being analyzed are IID over the estimation window. PCA is frequently applied to financial data, such as stock returns, even though these data exhibit obvious and substantial changes in volatility. We show that the IID assumption can be substantially weakened; we require only that the return data are generated by a single distribution with a possibly variable scale parameter. In other words, we assume that the return is R_t = v_t φ_t, where the variables φ_t are IID with finite variance, and v_t and φ_t are independent. We find that when PCA is applied to data of this form, it correctly identifies the underlying factors, but with some loss of efficiency compared to the IID case.
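The setting above can be illustrated with a small simulation (all numbers hypothetical, not from the talk): returns are generated as R_t = v_t φ_t with a factor structure in φ_t, and PCA applied to the raw returns still recovers the factor space despite the variable scale.

```python
import numpy as np

rng = np.random.default_rng(0)
T, n, k = 5000, 10, 2  # periods, assets, factors

# Hypothetical model: R_t = v_t * phi_t, phi_t IID with a k-factor structure,
# v_t an independent, slowly varying scale (volatility) process.
B = rng.normal(size=(n, k))                        # true factor loadings
phi = rng.normal(size=(T, k)) @ B.T + 0.1 * rng.normal(size=(T, n))
v = np.exp(0.5 * np.sin(np.linspace(0, 20, T)))    # deterministic scale path
R = v[:, None] * phi

# PCA via eigendecomposition of the sample covariance matrix
C = np.cov(R, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(C)               # ascending eigenvalues
top = eigvecs[:, ::-1][:, :k]                      # leading k principal components

# The span of the top PCs should approximate the span of the true loadings B
proj = top @ top.T                                 # projector onto estimated factor space
recovery = np.linalg.norm(proj @ B) / np.linalg.norm(B)
print(round(recovery, 2))                          # close to 1 when factors are identified
```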
Now assume that the scale parameter is set by a continuous mean-reverting process, such as the volatility of return in the Heston model. We use an exponentially weighted standard deviation of historical returns, v̂_t, as an estimate of v_t. It is standard practice to estimate risk measures such as Value at Risk (VaR) and Expected Tail Loss (ETL) from the estimated volatility v̂_t by assuming that the returns are Gaussian. These Gaussian estimates systematically under-forecast VaR and ETL in the presence of variable volatility, excess kurtosis, and negative skew. We propose the "historical method," using the empirical distribution of R_{t+1}/v̂_t, as a more robust method for estimating VaR and ETL. In simulation, we find the historical method provides accurate forecasts of both VaR and ETL in the presence of variable volatility and excess kurtosis, and accurate forecasts of VaR in the presence of negative skew.
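A minimal sketch of the historical method described above, on synthetic data and with an assumed EWMA decay λ = 0.94 (the talk's exact estimator and parameters may differ):

```python
import numpy as np

rng = np.random.default_rng(1)
T = 10_000
# Synthetic returns: heavy-tailed IID shocks scaled by a slowly varying v_t
v_true = np.exp(0.4 * np.sin(np.linspace(0, 30, T)))
R = v_true * rng.standard_t(df=5, size=T)

# Exponentially weighted volatility estimate v_hat_t (lambda is an assumed choice)
lam = 0.94
v2 = np.empty(T)
v2[0] = R[0] ** 2
for t in range(1, T):
    v2[t] = lam * v2[t - 1] + (1 - lam) * R[t - 1] ** 2
v_hat = np.sqrt(v2)

# Historical method: empirical distribution of scaled returns R_t / v_hat_t
z = R[100:] / v_hat[100:]          # drop a burn-in period
alpha = 0.01
q = np.quantile(z, alpha)
var_z = -q                          # VaR of the scaled distribution
etl_z = -z[z <= q].mean()           # expected tail loss beyond VaR

# One-step-ahead forecasts rescale by the latest volatility estimate
var_forecast = var_z * v_hat[-1]
etl_forecast = etl_z * v_hat[-1]
```

The Gaussian shortcut would replace var_z with the normal quantile 2.33; for heavy-tailed scaled returns the empirical quantile is noticeably larger, which is the under-forecasting the abstract refers to.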
Forecasting Limit Order Book Liquidity Supply-Demand Curves with Functional Auto-Regressive Dynamics
Ying CHEN, National University of Singapore, Singapore
The limit order book contains comprehensive information about liquidity on the bid and ask sides. We propose a Vector Functional Auto-Regressive (VFAR) model to describe the dynamics of the limit order book's supply and demand curves, and utilize the fitted model to predict the joint evolution of the liquidity demand and supply curves. In the VFAR framework, we derive a closed-form maximum likelihood estimator under sieves and establish the asymptotic consistency of the estimator. Applied to limit order book records of 12 NASDAQ stocks traded from 2 Jan 2015 to 6 Mar 2015, the VFAR model shows strong predictability in liquidity curves, with R² values as high as 98.5 percent for in-sample estimation and 98.2 percent in out-of-sample forecast experiments. It produces accurate 5-, 25- and 50-minute forecasts, with root mean squared error as low as 0.09 to 0.58 and mean absolute percentage error as low as 0.3 to 4.5 percent.
This is a joint work with Wee Song Chua (National University of Singapore, Singapore) and Wolfgang Karl Haerdle (Humboldt Universitaet zu Berlin, Germany).
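The sieve-plus-autoregression idea behind a functional autoregressive model can be sketched as follows (synthetic curves, a simple polynomial sieve, and least-squares estimation in place of the paper's maximum likelihood estimator; all choices here are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(3)
T, m, k = 300, 50, 4          # curve observations, grid points, sieve dimension

# Synthetic stand-in for liquidity curves: smooth curves whose basis
# coefficients follow an AR(1) process with persistence 0.8
grid = np.linspace(0.0, 1.0, m)
basis = np.column_stack([grid**j for j in range(k)])   # polynomial sieve basis
coef = np.zeros((T, k))
for t in range(1, T):
    coef[t] = 0.8 * coef[t - 1] + 0.1 * rng.normal(size=k)
curves = coef @ basis.T + 0.001 * rng.normal(size=(T, m))

# Sieve step: least-squares projection of each observed curve onto the basis
C = curves @ basis @ np.linalg.inv(basis.T @ basis)

# Functional AR(1): fit the coefficient dynamics by least squares
X, Y = C[:-1], C[1:]
A = np.linalg.lstsq(X, Y, rcond=None)[0]   # estimated transition matrix
forecast = (C[-1] @ A) @ basis.T           # one-step-ahead curve forecast
```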
Fire sales, indirect contagion and systemic stress-testing
Rama CONT, Imperial College London, United Kingdom
We present a framework for modeling the phenomenon of fire sales in a network of financial institutions with common asset holdings, subject to leverage or capital constraints. Asset losses triggered by macro-shocks may interact with portfolio constraints, resulting in liquidation of assets, which in turn affects market prices, leading to contagion of losses when portfolios are marked to market. If mark-to-market losses are large, this may in turn lead to a new round of fire sales.
In contrast to balance-sheet contagion mechanisms based on direct linkages, this price-mediated contagion is transmitted through common asset holdings, which we quantify through liquidity-weighted overlaps across portfolios. Exposure to price-mediated contagion leads to the concept of indirect exposure to an asset class, as a consequence of which the risk of a portfolio depends on the matrix of asset holdings of other large and leveraged portfolios with similar assets.
Our model provides an operational systemic stress-testing method for quantifying the exposure of the financial system to these effects. Using data from the European Banking Authority, we apply this method to examine the exposure of the EU banking system to price-mediated contagion. Our results indicate that, even with optimistic estimates of market depth, moderately large macro-shocks may trigger fire sales which then lead to substantial losses across bank portfolios, modifying the outcome of bank stress tests. Moreover, we show that price-mediated contagion leads to a heterogeneous cross-sectional loss distribution across banks, which cannot be replicated simply by applying a macro-shock to bank portfolios in the absence of fire sales. We propose a bank-level indicator, based on the analysis of liquidity-weighted overlaps across bank portfolios, which is shown to be strongly correlated with bank losses due to fire sales and may be used to quantify the contribution of a financial institution to price-mediated contagion.
Unlike models based on 'leverage targeting', which assume symmetric reactions to gains or losses, our approach is based on the asymmetric interaction of portfolio losses with one-sided constraints such as leverage or capital requirements; it distinguishes between insolvency and illiquidity and leads to substantially different loss estimates in stress scenarios.
This is a joint work with Eric SCHAANNING (Imperial College London, United Kingdom).
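A one-round toy version of the mechanism (three banks, two assets; the balance-sheet numbers, leverage cap, and linear market-impact rule are hypothetical illustrations, not the paper's calibration):

```python
import numpy as np

# Hypothetical toy example: 3 banks, 2 illiquid asset classes.
Pi = np.array([[100.0, 50.0],       # holdings matrix (bank x asset)
               [80.0, 120.0],
               [20.0, 10.0]])
equity = np.array([30.0, 40.0, 10.0])
lev_max = 8.0                        # assumed leverage cap (assets / equity)
depth = np.array([2000.0, 1500.0])   # assumed market depth (linear impact)

# Macro-shock: 10% loss on total holdings hits equity (one-sided constraint)
equity = equity - 0.10 * Pi.sum(axis=1)

# Banks above the leverage cap sell assets proportionally to restore it
assets = Pi.sum(axis=1)
excess = np.maximum(assets - lev_max * equity, 0.0)
sell_frac = np.minimum(excess / assets, 1.0)
sales = sell_frac[:, None] * Pi              # volume sold per asset

# Linear price impact from aggregate sales depresses prices...
impact = sales.sum(axis=0) / depth
# ...and mark-to-market losses fall on ALL holders of the same assets:
mtm_loss = (Pi - sales) @ impact
print(np.round(mtm_loss, 2))
```

Note that bank 3 never breaches its constraint and sells nothing, yet still suffers a mark-to-market loss through its overlap with the deleveraging banks: this is the indirect, price-mediated contagion channel.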
Opaque Bank Assets and Optimal Equity Capital
Jussi KEPPO, National University of Singapore, Singapore
Banks' assets are opaque, and therefore we model their true accounting asset values as partially observed variables. We derive a stochastic control model to optimize banks' dividend and recapitalization policies in this situation, and calibrate it to a sample of U.S. banks. According to the calibrated model, the noise in reported accounting asset values hides about one-third of the true asset return volatility and raises the banks' market equity value by 7.8%, because the noise hides the banks' solvency risk from banking regulators. In particular, banks with a high level of loan loss provisions, non-performing assets, and real estate loans, and with a low volatility of reported total assets, have noisy accounting asset values. Owing to the substantial shocks to true asset values, banks' assets were more opaque during the recent financial crisis.
This is a joint work with Min DAI (National University of Singapore, Singapore) and Shan HUANG (National University of Singapore, Singapore).
Advancing the universality of pricing methods for Asian options to any underlying process – applications in financial markets
Ioannis KYRIAKOU, City, University of London, United Kingdom
We propose an accurate method for pricing arithmetic Asian options in a general model setting by means of a lower bound approximation. In particular, we derive analytical expressions for the lower bound in the Fourier domain. This is then recovered by a single univariate inversion and sharpened using an optimization technique.
Our proposed method can be applied to computing the prices and price sensitivities of Asian options with fixed or floating strike price, discrete or continuous averaging, under a wide range of stochastic dynamic models, including exponential Lévy models, mean-reverting models, affine stochastic volatility models, and the constant elasticity of variance diffusion. We present evidence of notable performance and robustness of our optimized lower bound for different test cases.
We then focus on applications in financial markets where the use of Asian options is prevalent, such as energy commodity and freight markets, exploiting the analytical tractability, computational efficiency and precision of our valuation framework. Asian options are favoured by risk managers as cost-saving hedging instruments and for their ability to mitigate problems relating to market manipulation of the underlying, especially in thinly traded markets. We calibrate different models to market option prices, analyse the competing fitted models, investigate the pricing errors and compare model performances. In general, we find that neglecting important stylized empirical facts leads to nontrivial mispricings.
Selling Financial Assets at the Right Time
Xun LI, Hong Kong Polytechnic University, Hong Kong
This work studies a continuous-time market where an agent, having specified an investment horizon and a targeted terminal mean return, seeks to minimize the variance of the return. The optimal portfolio of such a problem is called mean-variance efficient à la Markowitz. It is shown that, when the market coefficients are deterministic functions of time, a mean-variance efficient portfolio realizes the (discounted) targeted return on or before the terminal date with a probability greater than 80%. This number is universal, irrespective of the market parameters, the targeted return, and the length of the investment horizon.
Preparing Financial Regulation for Forthcoming Crises
Eckhard PLATEN, University of Technology, Sydney, Australia
This paper puts up for discussion the potential of new technologies that combine modern long-term hedging strategies with empirical insights from classical financial planning in order to retain or recover the solvency of pension funds and life insurance companies in an increasingly difficult market environment. In contrast to other suggestions, this would be done without shifting key risks to households and without affecting the stability of the international financial system. The proposed method involves drawdown constraints and diversification under the benchmark approach. For the proposed long-term-focused methodology, a more general modelling world is considered than is permitted under the classical no-arbitrage approach.
This is a joint work with Michael SCHMUTZ (Swiss Financial Market Supervisory Authority (FINMA), Zurich, Switzerland, and University of Bern).
Optimal Trade Execution and Backward Stochastic Differential Equation
Alexandre POPIER, Université du Maine, France
In this talk we present in detail an optimal stochastic control problem related to portfolio liquidation problems. We then explain how it can be solved using backward stochastic differential equations. We generalize the existing results in three directions: firstly, there is no assumption on the underlying filtration (in other words, on the noise); secondly, we relax the terminal liquidation constraint; and finally, the time horizon can be random. This is a joint work with T. Kruse (Essen, Germany). At the end we will briefly present two works in progress: one with S. Ankirchner (Jena, Germany), A. Fromm (Jena, Germany) and T. Kruse, and the second with C. Zhou (NUS, Singapore).
Systemic Risk, Group Risk, and Risk Sharing
Stefan WEBER, Leibniz Universität Hannover, Germany
The talk reviews recent developments on measuring systemic and group risk and discusses related results on risk sharing.
This is a joint work with Sipan AGIRMAN (Leibniz Universität Hannover, Germany), Kerstin AWISZUS (Leibniz Universität Hannover, Germany), Zachary FEINSTEIN (Washington University in St. Louis, USA), Anna-Maria HAMM (Leibniz Universität Hannover, Germany), and Birgit RUDLOFF (Wirtschaftsuniversität Wien, Austria).
Portfolio Benchmarking under Drawdown Constraint and Stochastic Sharpe Ratio
Ankush AGARWAL, Centre de Mathématiques Appliquées, France
We consider an investor who seeks to maximize her expected utility derived from her terminal wealth relative to the maximum performance achieved over a fixed time horizon, and under a portfolio drawdown constraint, in a market with local stochastic volatility (LSV). In the absence of closed-form formulas for the value function and optimal portfolio strategy, we obtain approximations for these quantities through the use of a coefficient expansion technique and nonlinear transformations. We utilize regularity properties of the risk tolerance function to numerically compute the estimates for our approximations. We illustrate that, in order to achieve a similar value function, the investor in the stochastic volatility model must deploy a portfolio strategy quite different from that in a constant volatility model, one which depends on the current level of volatility.
This is a joint work with Ronnie SIRCAR (Princeton University, United States).
Worst-Case Bounds for Expected Shortfall under Marginals
Anulekha DHARA, Singapore University of Technology and Design, Singapore
In many applications, such as finance, revenue management, and energy systems, the available information is often based on historical data or differing expert opinions, which may lead to noisy, imprecise and/or incomplete data observations, and hence to the non-existence of a joint distribution or to invalid/inconsistent information. In recent years, there has been growing interest in estimating bounds on the expected shortfall risk (also referred to as Conditional Value-at-Risk) of portfolios when the probability distribution of the underlying risks is uncertain. In this paper, we develop a bound on the expected shortfall for the portfolio optimization problem with only marginal information at hand, where the univariate marginals are fixed while the additional bivariate marginals can be perturbed. When the univariate and bivariate marginals are consistent with an associated joint distribution, the bound interpolates between the standard univariate bound and the bivariate bound. When the given marginals are inconsistent, or no joint distribution can be associated with the given univariate and bivariate information, we solve the problem using a two-step approach: (a) finding the closest consistent bivariate marginals using the Kullback-Leibler divergence, and (b) computing the worst-case expected shortfall.
This is joint work with Bikramjit DAS (Singapore University of Technology and Design, Singapore) and Karthik NATARAJAN (Singapore University of Technology and Design, Singapore).
A network model for CCP Liquidity Risk Stress Testing under incomplete information
Patrick GE, Singapore Exchange Limited, Singapore
Most network models in the literature are academic in nature and assume perfect information about nodes and linkages; thus the results are often simulations that do not reflect the limitations of real-world data. Such research can often make generic statements about the behavior of the entire system, but is less useful for risk management purposes for a single institution that is part of the network.
In practice, a CCP ecosystem has a simpler network than the interbank network, because many bilateral linkages are replaced by direct many-to-one linkages to the CCP as the central clearer. Also, the CCP is often the regulator in the ecosystem and thus has information visibility (of both exposures and financial resources) over all the participants.
We put forth a realistic network model which maximizes the use of data available to a CCP to simulate credit default contagion and study the resulting liquidity needs required to avert the default of the CCP itself. This model combines default simulations and a bipartite network to give a loss distribution conditional on a stress state. This "risk-based" approach has many advantages over the standard "cover two" approach of the CCP industry. To our knowledge, this is the first paper that supplements a network model with a credit default model.
This is a joint work with Xin Yee LAM (Singapore Management University, Singapore) and Max WONG (Singapore Exchange Limited, Singapore).
Simulating Risk Measures
Wei JIANG, National University of Singapore, Singapore
Risk measures, such as value-at-risk and expected shortfall, are widely used in risk management, as exemplified in the Basel Accords proposed by the Bank for International Settlements. We propose a simple general framework, allowing dependent samples, for computing these risk measures via simulation. The framework consists of two steps: in the S-step, the risk measure is estimated using a selected sorting algorithm; in the R-step, the necessary sample size is computed based on newly derived asymptotic expansions of the relative error for dependent samples, and the S-step is repeated until the requirement on the relative error is met. We systematically investigate various sorting methods for the S-step. Numerical experiments indicate that the algorithm is easy to implement and fast compared to existing methods, even at the 0.001 quantile level. We also give a comparison of the relative errors of value-at-risk and expected shortfall.
This is a joint work with Steven Kou (National University of Singapore, Singapore).
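A minimal sketch of the S-step for IID samples (a plain full sort; the paper's framework also covers dependent samples, alternative sorting algorithms, and the R-step's sample-size rule):

```python
import numpy as np

rng = np.random.default_rng(2)

def var_es_from_sample(x, alpha):
    """Estimate VaR and expected shortfall at level alpha from a sample
    of losses by sorting (illustrative version of the S-step)."""
    x = np.sort(x)                       # ascending losses
    k = int(np.ceil(alpha * len(x)))     # tail sample size
    var = x[-k]                          # loss exceeded with probability ~alpha
    es = x[-k:].mean()                   # average loss beyond VaR
    return var, es

losses = rng.normal(size=100_000)
var99, es99 = var_es_from_sample(losses, 0.01)
# For standard normal losses, VaR at 99% is ~2.33 and ES at 99% is ~2.67
print(round(var99, 2), round(es99, 2))
```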
Recovering Linear Equations of XVA in Bilateral Contracts
Junbeom LEE, National University of Singapore, Singapore
In this talk, we investigate conditions under which the derivative price under XVA can be represented explicitly. As soon as borrowing and lending rates differ, the XVA problem becomes a semilinear equation, which makes finding an explicit solution difficult. We show that the associated valuation problem is actually linear under certain conditions, so that pricing has the same complexity as in classical pricing theory. Moreover, the conditions are mild in the sense that they can be met by choosing adequate covenants between the investor and the counterparty.
Robust Utility Maximization with Lévy Processes
Ariel NEUFELD, ETH Zurich, Switzerland
We present a tractable framework for Knightian uncertainty, the so-called nonlinear Lévy processes, and use it to formulate and solve problems of robust utility maximization for an investor with logarithmic or power utility. The uncertainty is specified by a set of possible Lévy triplets; that is, possible instantaneous drift, volatility and jump characteristics of the price process. Thus, our setup describes uncertainty about drift, volatility and jumps over a class of fairly general models. We show that an optimal investment strategy exists and compute it in semi-closed form. Moreover, we provide a saddle point analysis describing a worst-case model.
This is a joint work with Marcel NUTZ (Columbia University, United States).
Lifecycle Consumption, Investment, and Voluntary Retirement with Uninsurable Income Risk
Seyoung PARK, National University of Singapore, Singapore
We present an optimal lifecycle consumption, investment, and voluntary retirement model with uninsurable income risk, and investigate how uninsurable income risk affects retirement, consumption, and investment decisions.
This is a joint work with Min DAI (National University of Singapore, Singapore) and Shan HUANG (National University of Singapore, Singapore).
CDS Implied Rating Curves
Ser-Huang POON, University of Manchester, United Kingdom
In this paper, we explore the extent to which the term structure of individual CDS spreads can be explained by the firm's rating. Using the Nelson-Siegel model, we construct rating curves from a cross-section of CDS spreads for each rating class. We find that the fitted rating curves contain meaningful information, in the sense that 76% of their time-series variation can be explained by typical credit and liquidity factors that are known to drive CDS spreads. The residuals, on the other hand, contain mostly transient liquidity information. Moreover, deviations converge towards the fitted curve over time; the larger the deviation, the more likely the convergence. Trading strategies exploiting the convergence of deviations generate an average arbitrage return of 3.6% (5 days) and 9% (20 days). Our findings suggest that our implied rating curves capture the core credit and liquidity risk information, which could be used to price other CDSs and in credit risk management.
This is a joint work with Olga KOLOKOLOVA (University of Manchester, United Kingdom) and Ming-Tsung LIN (De Montfort University, United Kingdom).
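The Nelson-Siegel functional form used to build such rating curves can be sketched as follows (the parameter values are hypothetical, not fitted values from the paper):

```python
import numpy as np

def nelson_siegel(tau, beta0, beta1, beta2, lam):
    """Nelson-Siegel curve at maturity tau > 0: level (beta0), slope (beta1)
    and curvature (beta2) terms with decay parameter lam."""
    x = tau / lam
    slope = (1.0 - np.exp(-x)) / x
    return beta0 + beta1 * slope + beta2 * (slope - np.exp(-x))

# Hypothetical upward-sloping CDS spread curve (spreads in percent)
maturities = np.array([1.0, 3.0, 5.0, 7.0, 10.0])
spreads = nelson_siegel(maturities, beta0=0.9, beta1=-0.5, beta2=0.3, lam=2.0)
```

Fitting (beta0, beta1, beta2, lam) to a cross-section of spreads per rating class yields one rating curve per class; the long end of the curve converges to the level parameter beta0.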
Exhaustible resources with production adjustment costs
Cong QIN, National University of Singapore, Singapore
We develop a general equilibrium model of exhaustible resources with production adjustment costs based on singular control. Adjustment costs give producers an incentive to maintain their current production level. As a result, the optimal production policy features a no-adjustment region, which, combined with demand shocks, gives natural explanations for several economic phenomena observed in real markets, such as backwardation, contango, and a recently documented stylized fact: the V-shaped relationship between the slope and the volatility of futures curves. It is also known that some factors, for example exploration, can generate a U-shaped price profile; our model predicts that adjustment costs significantly prolong the period during which the price stays at the bottom. This can help us understand why the prices of some commodities, e.g. oil, can remain quite low for a long time. Finally, when substitute technologies are taken into consideration, our model predicts that reducing production is not optimal when the price of the substitute is low. This provides a new perspective on why production cuts are debated within OPEC.
This is a joint work with Min DAI (National University of Singapore, Singapore) and Steven KOU (National University of Singapore, Singapore).
Gini curves and top incomes
Hui SHAO, National University of Singapore, Singapore
The Gini coefficient, the most commonly used inequality measure, shares the limitations of most point estimators, and its proper use is often controversial. Instead of a single Gini coefficient, this paper suggests using a Gini curve consisting of truncated Gini coefficients that exclude the top incomes. The Gini curve turns out to present the full information of an income distribution and, more importantly, we provide an axiomatic framework based on weighted expected utility theory to support the use of such an inequality curve. We examine the properties of Gini curves. In our empirical study, we investigate the evolution of the Gini curves for annual individual incomes in the United States and Australia. In particular, we find that the Gini coefficients excluding top incomes have decreased significantly in both countries, while the overall Gini coefficients have remained relatively stable or even trended upward; we also demonstrate that this phenomenon is consistent with the key conclusion of recent top-income research that top income shares have been increasing since 1970. Finally, we compare the newly proposed Gini curves with the fundamental Lorenz curves.
This is a joint work with Min DAI (National University of Singapore, Singapore) and Steven KOU (National University of Singapore, Singapore).
Generalized Phase-Type Distribution under Markov Mixtures Process
Budhi Arta SURYA, Victoria University of Wellington, New Zealand
The phase-type model has been an important probabilistic tool in the analysis of complex stochastic system evolution. Introduced by Neuts in 1975, the model describes the lifetime distribution of a continuous-time finite-state absorbing Markov chain. It has found applications in a wide range of areas, e.g. actuarial science, credit risk, financial economics, queuing theory, reliability theory, and telecommunications. It was brought to survival analysis by Aalen in 1995. However, the model lacks the ability to capture heterogeneity and to include past information, owing to the Markov property of the underlying process. We attempt to generalize the model by replacing the underlying process with a Markov mixtures process. The Markov mixtures process was used to model job mobility by Blumen et al. in 1955, where it was known as the mover-stayer model: the tendency of low-productivity workers to move out of their jobs is described by a Markov chain, while high-productivity workers tend to avoid job turnover. The model was extended by Frydman in 2005 to a mixture of finite-state Markov chains moving at different speeds on the same state space. In general, the mixtures process does not have the Markov property. We revisit the Markov mixtures model for absorbing Markov chains moving at different speeds on the same finite state space, and propose a generalization of the phase-type model with multiple absorbing states. The latter allows us to cope with competing risks in survival analysis. The new distribution has two main appealing features: it can model heterogeneity and include past information of the underlying process, and it comes in closed form. Built upon the new distribution, we propose a conditional forward (cause-specific) intensity, which can be used to determine the rate of occurrence of future events (of a given type) based on available past information.
A numerical study suggests that the new distribution and its forward intensity offer significant improvements over the existing model.
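For reference, the classical phase-type distribution that the talk generalizes can be sketched in a few lines (the sub-intensity matrix and initial law below are hypothetical illustrations):

```python
import numpy as np

# Classical phase-type distribution: the lifetime of a continuous-time
# absorbing Markov chain with sub-intensity matrix S over the transient
# states and initial distribution alpha (hypothetical numbers).
S = np.array([[-2.0, 1.0],
              [0.5, -1.0]])
alpha = np.array([0.6, 0.4])

def ph_cdf(t):
    """P(lifetime <= t) = 1 - alpha @ expm(S t) @ 1, computed via
    eigendecomposition of S."""
    w, V = np.linalg.eig(S)
    expSt = (V * np.exp(w * t)) @ np.linalg.inv(V)
    return float(np.real(1.0 - alpha @ expSt @ np.ones(len(alpha))))

print(ph_cdf(1.0))
```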
Goal-Reaching Problem with Borrowing and Short-Sale Constraints
Xiangwei WAN, Shanghai Jiao Tong University, China
In this paper, we consider the problem of reaching a goal by a finite deadline, studied in Browne [Adv. Appl. Prob., 1999, 31, 551-577], with borrowing and short-sale constraints. Compared with Browne's optimal portfolios, extra risk-taking (when allowed) occurs at any time before the deadline. Near the deadline, either the borrowing constraint or the short-sale constraint is binding, to bet on the fluctuation of the risky asset if the wealth is still far from the investment goal. For beating a benchmark portfolio with high probability, numerical results show that the optimal strategy substantially outperforms the other available strategies.
This is a joint work with Min DAI (National University of Singapore, Singapore) and Steven KOU (National University of Singapore, Singapore).
Pricing of Dual-Purpose Funds in China
Chen YANG, National University of Singapore, Singapore
In this talk we discuss the valuation of dual-purpose funds, a type of structured mutual fund recently popular in China. We propose a mathematical formulation of the fund contract, and develop a pricing model in terms of a nonlocal parabolic PDE. We find significant overvaluation of the B shares of these funds in the market, and identify the main contributing factor behind it.