8 March 2012

The theorem of option pricing made EZ

I am writing this to convince an analyst friend of mine that the so-called theorem of option pricing has nothing to do with probability and that, philosophically, it is very simple.

I will prove the fundamental theorem of option pricing in a trivial case.

Suppose there is a box which transforms the dollars you put in into something of different value. For example, I put 1 dollar in the box and this becomes either 10 dollars or 0.01 dollars. The problem is that I don't know what the output of the box will be, and I also know nothing about the probability of either outcome. All I know is that 1 dollar turns magically into something else: either 10 dollars or 1 cent.



More generally, suppose that the box takes a token that is valued at $S$ dollars and spits out another token that is valued at $S'$ dollars, which could be higher or lower than $S$. To be concrete, and also to keep things simple, let's say that $S'$ is either $(1+b)S$ or $(1-a)S$. If we put $u$ tokens in the machine, then the machine will spit out exactly the same number of tokens, all of which will be valued at the higher price or all at the lower price. We allow the number of tokens to be any positive number; for example, 2/3 of a token is possible. Assume that $0 < a < 1$ and $b > 0$.
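(In this notation, the dollar box of the previous example has $S = 1$, $b = 9$ and $a = 0.99$, since $(1+9)\cdot 1 = 10$ dollars and $(1-0.99)\cdot 1 = 0.01$ dollars.)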

Now, me being a smartass, I tell you the following: "Listen buddy, the machine makes money, not all the time, but sometimes. I give you the following option: You won't have to do anything. I will operate the machine for you. If it makes money I will give you some. If not, you won't get anything." "Oh, great", you reply, "go ahead". "Well," I say, "you know, you have to pay me a bit now, so that you get the benefits later." "How much?" you ask. "We'll figure it out", I reply.

To make things general, let's say that our contract is a certain function
$f(S')$
meaning that if the machine changes the value of one token to $S'$ dollars then I will give you $f(S')$ dollars.
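For example, my promise in the dialogue above could be the function $f(S') = \max(S'-S, 0)$: you get the machine's profit on one token, if there is any, and nothing otherwise. (This particular $f$ is only an illustration; what follows works for any $f$.)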

My rationale is as follows. I'm not a sucker. I won't risk anything at all. I will charge you $X$ dollars and, with this, I will buy $u$ tokens, costing me $uS$ dollars, and put the difference $c = X-uS$ aside. I will put the $u$ tokens in the machine and the machine will change the value of each token to $S'$. In the end, I will have $uS'$ dollars from the machine, plus $c$ aside, which means that I will have
$Y = uS' + c$ dollars
and since I am a gentleman, I will have to fulfil my promise, meaning that
$Y = f(S')$.
Since $Y-X = u(S'-S)$, we see that
$X+u(S'-S) = f(S')$
must be fulfilled, whichever of its two possible values $S'$ takes. This leads to two equations with two unknowns, $X$ and $u$. The equations are:
$X+ubS = f((1+b)S)$,     if the price goes up,
$X-uaS = f((1-a)S)$,    if the price goes down.
Subtracting the second from the first gives
$u = \frac{ f((1+b)S)- f((1-a)S)}{(a+b)S}$.
Putting this back into the second equation, we find
$X = \frac{a}{a+b} f((1+b)S) + \frac{b}{a+b}  f((1-a)S)$.
I observe that my solution is good, because $u \ge 0$ (at least when $f$ is increasing, which is the typical case) and because both $u$ and $X$ depend on nothing else (not on my astrologer, nor on my mood) except the price $S$ of the token. So I tell you: I will charge you $X$ dollars. (If $uS$ turns out to be larger than $X$, then $c$ is negative, so I will temporarily borrow $|c|$ dollars and return them at the end.)
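For the record, here is a minimal sketch of these two formulas in Python (my own illustration, not part of the argument; the function name is made up):

def price_and_hedge(S, a, b, f):
    """One step of the box: the token price S becomes either (1+b)*S or (1-a)*S.
    Returns the fair charge X and the number of tokens u to buy, straight
    from the two formulas derived above."""
    f_up = f((1 + b) * S)     # what I owe you if the price goes up
    f_down = f((1 - a) * S)   # what I owe you if the price goes down
    u = (f_up - f_down) / ((a + b) * S)    # hedging strategy
    X = (a * f_up + b * f_down) / (a + b)  # fair price
    return X, u

# The dollar box (S = 1, b = 9, a = 0.99) with the contract f(S') = max(S' - S, 0):
X, u = price_and_hedge(1.0, 0.99, 9.0, lambda s: max(s - 1.0, 0.0))
print(X, u)   # X is about 0.89, u is about 0.90

In words: to honour that particular promise I would charge you about 89 cents and buy about 0.9 of a token; you can check that in both outcomes I end up with exactly what I owe you.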

That is all.

Now that you have learned the above, you can create a dictionary of jargon:
  1. Market: it is the box described above.
  2. Share: the token.
  3. Stock: a set of tokens.
  4. Bond: the quantity $c$; with $c$ positive (respectively, negative) interpreted as buying (respectively, selling).
  5. Portfolio: the pair $(u,c)$.
  6. Hedging strategy: it refers to the number of tokens $u$.
  7. Option: the function $f$.
  8. Price: the variable $X$.
  9. Completeness: it refers to the fact that there is a unique solution $(u,X)$ to the system of equations. (If $S'$ takes not two, but three values, completeness is lost.)
  10. Arbitrage: the absence of arbitrage means that I cannot make money without any risk; indeed, with the price $X$ above I break exactly even.
  11. Transaction cost: I may charge you an extra fee.
  12. Equivalent martingale measure: You can think of a random variable $R$ taking value $-a$ with probability $b/(a+b)$ or value $b$ with probability $a/(a+b)$ (these probabilities constitute the probability measure), write $S'=(1+R)S$ and rewrite the equation for $X$ as $X= E[f(S')] = E[Y]$ (one says that $(X,Y)$ is a martingale); a quick check follows right after this list.
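A quick check of the last item (my addition; it follows at once from the formula for $X$): under this measure the expected value of the new token price is $E[S'] = \frac{a}{a+b}(1+b)S + \frac{b}{a+b}(1-a)S = S$, and likewise $E[f(S')] = \frac{a}{a+b}f((1+b)S) + \frac{b}{a+b}f((1-a)S)$, which is exactly the formula for $X$.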
Who could have ever thought that there is such a rich dictionary behind a simple equation?

By the way, what theorem have we proved? Cast in the fancy terminology, we have proved a theorem saying that, in our complete market with no arbitrage, any option can be priced fairly by using a unique hedging strategy which specifies our portfolio in terms of shares of stock and bonds.

In reality, we have proved that I lure you into putting your money in the magic box, that I bear no risk of losing anything, and that it is you who bears all the risk. However, by charging a bit more than the fair price $X$, and by doing the same not just with you but with a few thousand other people whom I attract by designing fancy options $f$, I surely make some money.
