Sunday, December 30, 2012

The most useful things I learned in one summer internship...

In Google Chrome:
  1. Ctrl-L brings you to the URL bar.  Think of all the time you'll save not having to move your hand from mouse to keyboard every time you want to go to a new website.
  2. You can 'train' Chrome to remember how to search a website.  For example, go to Amazon and run an empty search.  Then, when you type amazon into the URL bar, you can press Tab and enter a search that will be performed on Amazon.  I find this really useful for browsing Wikipedia.
Happy holidays.

Friday, December 28, 2012

Breaking down the Black Box

This is a bit of a re-blog of other people's work, but I think most people would find this interesting.  Below are some articles that I enjoyed because they take a complicated piece of software and break it down into understandable, bite-sized chunks.  Check these out at your leisure, but note that they're ordered by their inclusion of domain-specific knowledge:

  1. Siri
  2. Dark Sky - Tells you when it's going to rain.
  3. Divvy - An app that tells you how to split a check with a group of friends.  The process of OCR is far more complicated than I had ever imagined.  Note: the diagrams used are 'state machines', basically graphs of states with transitions (actions) between them, and A* is an algorithm that finds the best sequence of choices given some function estimating proximity to a goal (e.g. the number closest to 0).

Monday, December 17, 2012

The liberator who destroyed my property

"Tell him. Tell him, the liberator who destroyed my property has realigned my perceptions." - Tyler Durden

Not to get all "Fight Club"-ey on you guys, but we need a little destruction in the world every now and then. 

The terms "hormesis" and "mithridatism" refer to the intentional exposure to toxins in order to strengthen the body. There is evidence you can cure some allergies or gain immunity to certain poisons this way. The cost usually isn't worth it (you can become disabled or die), but it makes sense in certain situations (e.g. you handle dangerous animals for a living).

This means that certain things, although dangerous in large doses, can be beneficial in small doses. Physical labor can cripple you, but it can also make you stronger. Permitting small forest fires to burn instead of fighting them will reduce the amount of flammable material left for uncontrollable large forest fires. What doesn't kill you makes you stronger.

Schumpeter called the disruptive innovation of entrepreneurs who displaced established economic orders "creative destruction". It's the process that destroys in order to create. In that vein, everything must be allowed to fail, whether it's a bank, a forest, an individual, or even a government.

According to Karl Popper, the difference between science and religion is that science can be disproven - he called this "falsifiability". Any scientific statement can be disproved given good enough contra-evidence. In other words, no scientific fact is "too-big-to-fail", and for good reason: practically every scientific theory we have ever "proved" has eventually been disproved and replaced with something better (e.g. Newtonian mechanics -> Relativity -> Quantum mechanics -> ???).

Nothing lasts forever - least of all complex systems such as government and economies - so why do we persist in believing that it should be otherwise?

I am not merely advocating allowing failure. We should actively create it. Instead of merely permitting forest fires, what if we deliberately instigated them? In wildfire management, the practice of intentionally lighting small forest fires is known as a "controlled burn". It has proven more effective at reducing the inherent instability within a forest than fighting every fire that comes along. However, at a certain point, a forest becomes too flammable for this strategy to work: in other words, the forest has become "too-big-to-fail". We tried to let Lehman burn, but it was already too late: the forest was too flammable.

Micro-fragility leads to macro-resilience. This is where the regulators and Elizabeth Warren and Occupy Wall Street and practically everyone else are getting it wrong: we shouldn't be making failure harder, we should be making it easier.

Monday, November 19, 2012

Speed Round

Sorry for the lack of posts recently. I've been busy cramming for the CFA.

One of my favorite newsletter writers does something he calls the "Speed Round" in which he shoots out somewhat random short paragraphs or one-liners. I have many (but not quite fifty) thoughts that aren't worthy of expanding into standalone posts (rather, I'm just too plain lazy). So, here's my dollar:


This was the first political election I ever really paid attention to. What do I have to show for it? I'm a marginally more knowledgeable but much less happy person. Unfortunately, ignorance was bliss.

Ron Paul

Ron Paul's final term as Congressman is coming to an end. I have mixed feelings about this. Ron Paul was an unelectable wacko, but at least he was my favorite unelectable wacko. He taught me to be skeptical of politicians, which taught me to stop believing in him. I don't think any other politician could ever teach that lesson.


Democracy

In a democracy where each individual is independent and gets exactly one vote, the consensus isn't the average. It's the median.
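
This is the one-dimensional median-voter result, and it's easy to check numerically. Below is a quick sketch (the voter positions are made-up illustrative numbers): each voter backs whichever of two positions is closer to their own ideal point, and the median position wins every pairwise matchup while the average can lose.

```python
import statistics

def pairwise_winner(a, b, voters):
    """Each voter backs whichever position is closer to their ideal point."""
    votes_a = sum(1 for v in voters if abs(v - a) < abs(v - b))
    votes_b = sum(1 for v in voters if abs(v - b) < abs(v - a))
    return a if votes_a > votes_b else b

voters = [0, 1, 2, 3, 10]        # a skewed electorate
med = statistics.median(voters)  # 2
avg = statistics.mean(voters)    # 3.2

# The median position beats the average position head-to-head...
print(pairwise_winner(med, avg, voters))  # 2
# ...and in fact beats any rival position for this electorate.
print(all(pairwise_winner(med, p, voters) == med
          for p in [0, 1, 3, 5, 10]))     # True
```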

Democracy pt. 2

The free market is a voting system where dollars are votes and what you vote on is the allocation of resources toward goods and services. The better you vote, the more votes you get back (aka return on investment). So the system tends to redistribute votes to those who vote better ("better" being defined by market consensus). In a way, it's a form of direct democracy.

Democracy pt. 3

One problem with the free market is that some people start out life with many more "votes" than others (e.g. Mitt Romney). High inheritance and estate taxes would eliminate this anti-competitive head start. Then the only people with a lot of "votes" would be those who accumulated them over their own lifetimes. It's kind of like anti-trust law, except applied to individuals instead of monopolies.

What about people on the other end of the spectrum? A minimum level of "votes" per person could be guaranteed through cash transfers to the qualifying poor. This is another reason why cash transfers should be preferred to direct handouts of goods and services. A government-mandated mix of goods and services in place of cash transfers denies the poor this voting opportunity.


Gerrymandering

Awesome word attached to a ridiculous practice: manipulating the geographic boundaries of electoral districts in order to gain a political advantage. The name is a portmanteau of Elbridge Gerry, a notorious (though not the first) gerrymanderer, and "salamander", the shape of one of his contorted districts.

In green is Illinois's 4th congressional district. (Note: this is one single district, not two)

Here's a 2002 piece by The Economist on this practice. Their subtitle: "In a normal democracy, voters choose their representatives. In America, it is rapidly becoming the other way around."

Ethical Utilitarianism

The Life You Can Save is kind of like the Bill Gates-Warren Buffett Giving Pledge, except for normal people. It's a pledge to donate a portion of your income to charity. They suggest a very modest 1% for those who make less than $105k (Buffett is pledging 99% of his wealth to philanthropy). I've taken the pledge. You should too.

On a similar note, unless you're an ancient Egyptian, everyone should be an organ donor on their driver's license. If you're not convinced, read this letter from Ken Kesey, the author of One Flew Over the Cuckoo's Nest.

Awareness Days

There are too many "awareness" days. For example, there's a "Pregnancy and Infant Loss Awareness Day", a "Stillbirth and Infant Loss Remembrance Day", and a "Pregnancy and Infant Loss Awareness Month"... none of which I was aware of. If you take a good idea ("let's have more awareness days!") to its logical extreme ("let's have as many awareness days as possible!"), sometimes you make things worse.

Hedging Against Life

Every decision you make in life is an implicit bet. Even the lack of a decision is a bet (a bet against all alternatives). There are certain large bets that almost everyone makes: choice of career, renting vs. home ownership, buying a car, etc. These give you concentrated exposure to a few external risk factors. It's important to recognize what those risk factors are, and to consider ways to fully or partially hedge them if they are unwanted.

Hedging is also known as insurance. The basic principle of hedging is to place an opposite, offsetting bet. Some basic terms: being "long"/"short" means betting that something will go up/down, respectively. For example:
  1. If you have a job, you are implicitly "long" 1) your employer, and 2) your industry. To hedge, you should "short" the stock prices of your employer and your industry.
  2. If you buy a house, you should short real estate.
  3. If you frequently drive, you should go long oil prices.
  4. There's a site that lets you hedge against bad grades.
  5. Finally, you can simply make more diversified decisions in life. For example, diversifying your skill set will expand the number of industries that will employ you, thus de-concentrating your career risk.
To take a historical view, a typical construction worker-homeowner-car driver in the mid-2000s had huge exposures to real estate and gas prices. Needless to say, he didn't do very well in the simultaneous housing/energy crisis of 2008.

Diversification and risk management do not merely apply to investment portfolios.
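
The bookkeeping behind "place an opposite, offsetting bet" can be sketched in a few lines. The names and dollar figures below are invented for illustration; a real hedge would use tradable proxies (REIT ETFs, oil futures, etc.) sized against the exposure.

```python
# Implicit bets (signed dollar exposures) from ordinary life decisions.
implicit_bets = {
    "real_estate": +300_000,  # homeowner: long real estate
    "oil_price":    -20_000,  # frequent driver: effectively short oil
    "employer":     +80_000,  # salary and options: long your employer
}

def hedge_order(exposure, hedge_ratio=1.0):
    """Offsetting position of the opposite sign (1.0 = full hedge)."""
    return -exposure * hedge_ratio

# Half-hedge the housing exposure: short $150k of a real-estate proxy.
print(hedge_order(implicit_bets["real_estate"], 0.5))  # -150000.0
# Fully hedge the driving exposure: go long $20k of an oil proxy.
print(hedge_order(implicit_bets["oil_price"]))         # 20000.0
```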

Words of the Week

Wednesday, October 24, 2012

Betting/trading strategies- Chunking

X%- a concept to unify different betting strategies
Following the last post, a trading cousin of the Martingale gambling strategy is the sit-and-wait strategy: buy and hold with no stop losses, and exit when you reach a predefined take-profit level. Both depend on the market/game eventually doing something (if I wait long enough/try enough times, it will eventually happen and I will be profitable overall). Sit-and-wait is like a more conservative version of Martingale. Say that instead of doubling down each time you lose, you increase your bet by x%. For example, x could be 200% (tripling down) or 0% (sit and wait, for casino bets). The stock market equivalent is rebalancing to increase your original exposure by x%: say you invested $100 in a stock and it drops to $75; under the Martingale strategy, you invest another $125 to bring your position to 2x the original exposure, or $200. This rebalancing assumes you are still facing the same return distribution.

As we increase x, we are more likely to get back to breakeven when we are losing. For example, say we just lost k bets in a row. Doubling down on a 50/50 bet, we are 50% likely to be up $1 overall by the next turn; sitting and waiting, we are only (1/2)^k likely to get back to breakeven within the next k turns.
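
Those two recovery probabilities are easy to tabulate. A minimal sketch, assuming fair 50/50 bets:

```python
def recovery_prob(k, strategy):
    """Chance of getting back to breakeven after k straight losses."""
    if strategy == "martingale":    # one win erases all prior losses
        return 0.5
    if strategy == "sit_and_wait":  # needs k straight wins to claw back k units
        return 0.5 ** k
    raise ValueError(strategy)

for k in range(1, 6):
    print(k, recovery_prob(k, "martingale"), recovery_prob(k, "sit_and_wait"))
```

The gap widens fast: after five straight losses, the flat bettor has about a 3% chance of erasing them within the next five turns, while the doubler still has an even shot on the very next bet.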

At the same time, our losses grow at an exponential rate (1+x%) if we keep losing. This creates an embarrassing problem: how do you avoid bankruptcy when your losses increase exponentially? This goes back to the "chunking" mentioned in the first post. In trading, this is how much extra margin/loss buffer you should budget for a trade before it starts to win, and if you are going to double down/add, at what point you should do so. I.e. if you go bankrupt (or can't take the pain/extreme exposure) after the third time you are wrong, then you need to split the potential worst-case scenario into 3 segments and add as you cross between segments. Generally you would split these segments by P&L, but you could also split by underlying price movements, or even by time, or by a function of all of the above. In the case of the stock market, it is also meaningful to vary x as a function of the underlying stock price movement, P&L change, time and other parameters.

Chunking- trade planning/margin budgeting
Let's derive the formula for how to chunk. For gambling, this is relatively easy. You lose b, b*(1+x), b*(1+x)^2, ..., b*(1+x)^(n-1), so your total loss is a geometric series equal to b*[(1+x)^n - 1]/x. Given n (how many times you can lose before you go bankrupt) and x (the add-on amount), you can calculate how big your bankroll has to be relative to bet size b: [(1+x)^n - 1]/x times the bet size. For example, if x = 100% (Martingale; doubling down each time) and n = 10 (bankrupt if wrong 10 times in a row), you need 2^10 - 1 = 1023x the initial bet.
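
The closed form is easy to sanity-check against a brute-force sum of the bet sizes:

```python
def bankroll_needed(b, x, n):
    """Geometric-series closed form: b * ((1+x)^n - 1) / x."""
    return b * ((1 + x) ** n - 1) / x

def bankroll_brute(b, x, n):
    """Direct sum of the n escalating bets b, b(1+x), ..., b(1+x)^(n-1)."""
    return sum(b * (1 + x) ** i for i in range(n))

# Classic Martingale: x = 100%, survive n = 10 straight losses.
print(bankroll_needed(1, 1.0, 10))  # 1023.0
print(bankroll_brute(1, 1.0, 10))   # 1023.0
```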

It is, however, more complicated for trading, since you don't lose everything you bet. Let's define a set {p_0, p_1, ..., p_n} to stand for the stock prices at which we dial up our exposure by another 1+x. The set could be an arithmetic progression (e.g. add per every $1 price drop), a function of P&L (e.g. add per every $100 loss), or of time (e.g. add every minute; here we would not be able to predict p_i before time t).

Let's start with the common "Martingale-like" case where p_i is an arithmetic sequence, we have a defined stop-loss point p_sl in stock price terms (not P&L), and we rebalance by doubling the number of shares we own each time. Then p_i = p_0 + (p_sl - p_0) * i / n, and the shares bought each time are b_i = b_0 * 2**i (note: ** means exponentiation).
Your worst-case loss is sum[b_i * (p_sl - p_i)] for i from 0 to n-1. Evaluating that sum (courtesy of Wolfram Alpha) and simplifying, it comes to b_0 * (p_sl - p_0) * [2**(n+1)/n - 1 - 2/n]. So say you had $1 mil budgeted to buy this stock, you wanted to be able to bet 4 times before going bankrupt, and you expect the entry-to-stop gap to be 100 - 70 = $30. Then we can work out how much (in shares) to bet initially if we want to be able to re-buy n times.
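
Under the same assumptions (arithmetic entry prices, doubled shares each step), the closed form can be checked against a direct sum over the buys. Losses are written as positive numbers here, so (p_0 - p_sl) plays the role of the $30 gap:

```python
def worst_loss_direct(b0, p0, p_sl, n):
    """Add up each tranche's loss if every buy rides down to the stop."""
    total = 0.0
    for i in range(n):
        p_i = p0 - (p0 - p_sl) * i / n       # i-th (re)entry price
        total += b0 * 2 ** i * (p_i - p_sl)  # doubled shares each step
    return total

def worst_loss_closed(b0, p0, p_sl, n):
    """The simplified formula from the post, with losses taken as positive."""
    return b0 * (p0 - p_sl) * (2 ** (n + 1) / n - 1 - 2 / n)

# e.g. entry at 100, stop at 70, up to n = 4 buys, 1 share initially
print(worst_loss_direct(1, 100, 70, 4))  # 195.0
print(worst_loss_closed(1, 100, 70, 4))  # 195.0
```

With a $1 mil budget, that gives b_0 = 1,000,000 / 195 ≈ 5,128 shares for n = 4.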

We can also create a table showing how the initial bet size (both in shares, b_0, and as notional % of the total risk budget/worst-case loss) shrinks as n grows. With a $1 mil budget and a $30 gap to the stop:

Initial Bet
n    Shares     Notional (% of budget)
1    33,333     333%
2    16,667     167%
3     9,091      91%
4     5,128      51%

Note that, for example, for n = 1 (just bet once and never redouble), you would be buying stock with a notional amount over 3x your risk budget. Notice that this does not necessarily mean you are "taking on leverage": for example, if you had $10 mil and budgeted $1 mil for this trade, and so bought ~$3 mil, you are not going over your total account equity. The traditional concept of "leverage" is insufficient to describe your trades/risk levels when it comes to more complicated trading strategies.

A more useful version for trading is perhaps this:

N    Initial Risk (% of budget)    Loss Multiplier
1    100%      1.0
2     50%      2.0
3     27.3%    3.67
4     15.4%    6.5

And then, to get how much notional to invest, you simply divide this percentage by the worst-case-scenario return. E.g. for N = 2 and a worst case of -25%, you should initially buy a notional amount of 0.50 / 0.25 = 2x your risk allocation. The loss multiplier is simply the inverse of the risk %. You can also use this to figure out your max loss if you bought some amount and are not sure how many times you will redouble:

E.g. you bought 100 shares, and prices went from 120 to 110. You have been redoubling every $5 (so this will be your third buy). If you are wrong again, then by 105 your total loss will be 1 / 0.27 * (100 * 15) ≈ $5.5k. Let's check this:

Current Price:   105
Total Loss:    -$5,500

So this checks out.
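
The same check in code form, summing each tranche's loss down to the stop:

```python
# Worked example: 100 shares at 120, doubling every $5 down, stopped at 105.
buys = [(100, 120), (200, 115), (400, 110)]  # (shares, entry price)
stop = 105
loss = sum(shares * (price - stop) for shares, price in buys)
print(loss)  # 5500
```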

Food for Thought- Thinking about Risk
This x% concept can be thought of as a trading style/bias. People who trade by x% (i.e. doubling down on losses and taking profits, vs. cutting losses and adding to winners) are essentially trying to convert a payoff distribution that is linear in stock prices (if the stock goes up $1 you make b, if it goes down $1 you lose b, where b = # of shares) into one that is non-linear and path dependent. It is more complicated to understand and incorporate the risks of this sort of dynamic trading strategy when evaluating your portfolio's risk characteristics. You might realize that what I described is very similar to a description of options. There are interesting ways of integrating traditional methods of looking at options risk with the concepts presented here (e.g. chunking and taking your trading style into account):

(1) Bringing options concepts to x%. Your trading style (especially in the extreme case where you preset limit/stop-loss orders religiously) can essentially be converted into discretized chunks of gamma. Using this, you can do standard risk analytics that take into account your intended future trades (e.g. the increased tail risk, the expected theta). Another possibility is to use x% to normalize certain portfolio measures: if you use a high +x% for a bucket of trades, you would expect high success rates and roughly equal average profit vs. average loss, but also significant long-tail losses. In fact, you could also back out an implied x% from your success rates or your average profit vs. average loss.

(2) Bringing risk budgeting to portfolio analytics on derivative positions. One problem with current scenario analysis and stress testing techniques is that they are usually based purely on past data, and that they are used as a control: someone (or some software) gives us a number and we just keep our losses under that threshold. One improvement is to be more logical when estimating the risk budget/max loss. For example, we derived a formula for max loss (given n) above. I believe a model based on fundamental characteristics (e.g. for stocks, using a P/E range to derive a max loss) could be an interesting supplement to today's standard risk models. Unfortunately, the risk manager is traditionally not involved in the investment research process and so lacks the in-depth fundamental knowledge required to create such a model.

Wednesday, October 17, 2012


I'm a big fan of rule-based policy over discretion-based policy. This is especially important in areas which 1) inherently have a lot of uncertainty and 2) affect lots of people, such as fiscal, monetary, and regulatory policy.

Rules are a predetermined, objective, and comprehensive set of responses to changes in inputs (e.g. changes in economic conditions). Think of rules as computer-implementable replacements for human policymakers. They reduce uncertainty, encourage transparency to the public, enforce government discipline, resist time-inconsistent behavior, and support optimal economic policy.

Rational Expectations vs. Uncertainty

One of the most important foundations of modern economics is that individual agents (me, you, workers, corporations, etc.) are rational and will discount future expectations. Thus, one of the most important channels in which government policy acts is through expectations management. If you expect interest rates to go up in the future, you will borrow more now. If you expect taxes to go down, you will defer consumption for later. And so on.

Uncertainty about future policy prevents these channels from working correctly. After all, in order to discount future expectations, you need to know what policy will look like in the future. Uncertainty, at best, undermines the public's confidence in politicians and at worst, can cause or deepen a recession (e.g. by essentially freezing consumers and producers in place). Furthermore, when a fundamental assumption of modern economics, rational expectations, is no longer true, most economic theories fall apart.

The advantage of rules is that they eliminate uncertainty. Simply by doing this, they make policy more effective, necessitating less drastic policy changes and making the policy path smoother and less volatile (no artificially induced fiscal cliffs here). Finally, rational expectations become more applicable to the real world.

Monetary Policy

Monetary policy refers to the central bank's control of the money supply, usually through targeting interest rates. Expansionary monetary policy means increasing the money supply and lowering interest rates, which leads to higher economic growth at the cost of higher inflation. Contractionary policy means decreasing the money supply and raising interest rates, which leads to lower growth with lower inflation (or deflation, negative inflation).

Currently, monetary policy works like this: Chairman Bernanke calls a closed-door meeting with the rest of the Federal Reserve Board of Governors. Immediately after the meeting, they issue a short, one-page press release. Three weeks later, they release the more in-depth minutes. Scores of private economists and consultancies make it their business to forecast Fed policy through official and unofficial statements by Fed officials. Even minor word changes between successive press releases, such as from "growth in business fixed investment appears to have slowed" to "growth in business fixed investment has slowed", are analyzed and interpreted.

In contrast, the Taylor rule replaces a discretionary interest rate regime with a simple three-variable equation (inflation, the real rate, and the output gap). In a Taylor rule regime, a computer collects the input data, plugs it into the equation, and the output is set as the new interest rate. The parameters and data are publicly available, so people can easily follow along in real time.
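
For the curious, here is a sketch of the original Taylor (1993) rule with its textbook 0.5/0.5 coefficients. The 2% defaults are Taylor's assumed equilibrium real rate and inflation target, not figures from this post:

```python
def taylor_rate(inflation, output_gap, real_rate=2.0, target_inflation=2.0):
    """Suggested nominal policy rate; all inputs in percent."""
    return (inflation + real_rate
            + 0.5 * (inflation - target_inflation)
            + 0.5 * output_gap)

# 3% inflation and output 1% above potential suggest a 6% policy rate.
print(taylor_rate(3.0, 1.0))  # 6.0
# At target inflation with a zero output gap, the rate sits at 4%.
print(taylor_rate(2.0, 0.0))  # 4.0
```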

Nobel Prize-winner Milton Friedman proposed an even simpler rule: you simply grow the money supply at k percent per year. If a rule is too complicated or has too many parameters, it just replaces the original source of uncertainty with new ones. Friedman understood this well with his simple k-percent rule.

Rules are especially applicable to monetary policy since the Fed is constantly playing a game of "expectations management" with the public. If you ever read Fed minutes or listen to Fed statements, you will know how often they emphasize maintaining credibility. This is how it works: monetary policy is credible because people believe it works, because people believe monetary policy is credible, because monetary policy works because...

The moment the public stops believing in the Fed's promise to maintain its commitment to price stability is the moment prices become unstable. Bernanke's claim that he will keep rates low until 2015 is not credible, simply because he will leave office in 2014. However, if a computer program were Chairman instead, its forecasts of its future actions would be credible (provided someone locked the computer and threw the password away to prevent tampering), because the public would know exactly what the computer program is likely to do, since its programming would be transparent and open to the public.

The main problem is that it's difficult to say which rule is best. Should we follow Taylor or Friedman? The Evans rule? An NGDP target? Or something completely different?

Fiscal Policy

Fiscal policy refers to government spending and (tax) revenue collection. Expansionary fiscal policy refers to increasing spending and/or decreasing taxes, which creates deficits, runs up debt and boosts the economy. Contractionary policy refers to decreasing spending and/or increasing taxes, which creates surpluses, pays down debt and slows the economy.

Deficits aren't bad in and of themselves, provided they are balanced by surpluses in other years. Unfortunately, governments tend to have a bias toward deficits. One reason is that politicians like to boost the economy to ensure re-election. Another is that deficit spending is a transfer of wealth from younger generations to older generations, and politicians tend to belong to the latter.

One way to think of debt is your present self borrowing from your future self. It might seem that your creditor is your direct lender (bondholders, credit card companies, mortgage banks, etc.), but they are merely middlemen between your present self and its ultimate creditor, your future self. In developed nations, older demographics see most of the immediate payoff of government spending (social security, medicare, etc.), and furthermore, they are unlikely to be around when the debts come due.

Thus, deficit spending is a transfer of wealth from the future (young) to the present (old). Older generations tend to be more politically established than younger ones (the average age of a US congressman is around sixty). Of course, none of this is a problem while an economy is developing and there are many more young people than old (as in the leftmost pyramid). But what happens when an economy stops being youthful (as in the rightmost pyramid)?

[Figure: DTM Pyramids.svg - population pyramids across the stages of the demographic transition]

Fiscal rules can be thought of as limits on the spending side and/or the tax side. Two types of rules you may already know about are 1) balanced budget amendments and 2) debt ceilings.

A balanced budget amendment would forbid deficits in every year. The problem with this type of policy is that it's inflexible and inherently pro-cyclical: in recessions, real incomes fall and tax revenues fall with them, which necessitates raising tax rates just to maintain revenues. Ideally, a rule should be counter-cyclical.

You may already be familiar with the US debt ceiling debacle of Summer 2011. The problems with this type of rule are that there have been no real consequences for breaching it (not for the better part of the past century, at least), and as a result, the debt ceiling has been raised 74 times. This is the equivalent of setting an alarm clock in order not to be late for work, but upon waking, hitting the snooze button about a dozen times. If your clock didn't have a snooze button, you wouldn't be so ready to fall back asleep upon hearing the alarm go off. Ironically, the very existence of a snooze button decreases your willingness and ability to rise in the morning. Thus, in addition to being counter-cyclical, the ideal rule needs to be credible and, therefore, difficult to change.

Unfortunately, most of these rules are determined at an aggregate, top-down level and thus have no meaning to individual lawmakers. Spending increases and cuts can be (and frequently are) decided bill by bill. A lawmaker's immediate interest lies not in meeting some high-level target but in securing federal funding for his pet projects. Thus, rules should target individual legislation rather than annual aggregates.

For example, one favorite idea of mine is that legislation needs to come in pairs: spending bills must be accompanied with revenue (tax, etc) bills. The exact proportion doesn't have to be dollar for dollar. In fact, you could have an independent board target the spending:revenue ratio, somewhat like a fiscal Fed. For example, a Keynesian board would dictate a spending:revenue ratio >1 during economic slowdowns and a ratio <1 during economic booms, in true counter-cyclical style.


While the prospect of electing computers to presidential office may never come, there are places for strict but transparent algorithms in government. This reduces uncertainty about government policy, allowing rational expectations to work and making purchasing and investment decisions easier for both consumers and producers. As political gridlock is unlikely to go away for the foreseeable future, continuing uncertainty over fiscal issues (such as the fiscal cliff) and monetary policy (such as the end of Bernanke's term in 2014) shows that discretionary policy is mainly just terrible policy.

Thursday, October 11, 2012

E-commerce same day delivery

It's interesting to see so many e-commerce businesses warming up to the idea of same day delivery. Kozmo tried a more ambitious version of the idea (1-hour delivery) back during the tech bubble, and failed when the bubble burst, but now Amazon, eBay, and Walmart are all looking to offer it to their customers. There are significant and expensive logistical hurdles involved in offering same day delivery, even for a brick-and-mortar like Walmart. It remains to be seen how online businesses who choose to offer this service will negotiate these difficulties.

If anyone is up to the challenge, it's Amazon. For the time being, they STILL don't have to collect sales tax in most states. They've even worked out deals with some states to delay collecting sales tax for several years, in exchange for building vast warehouses that employ thousands in those states. These warehouses will, in turn, serve as the logistical backbone that lets Amazon ship products to customers even faster.

The benefits of offering same day delivery are quite clear: it is the holy grail of convenience shopping. Shoppers have had to choose between buying online for a lower price and waiting for delivery, or driving to a brick-and-mortar store and getting the product right away. With same day shipping, online retailers offer even more instant gratification than their offline counterparts. Imagine you completely forget that it's your SO's birthday. You order her a gift and a nice card while you're at work, and it's waiting for you at your home (or Amazon locker) by the time you get back. You don't even have to spend time and gas driving to the store.

It's still unclear, though, whether same day delivery will be worth the effort for a general retailer like Amazon. Already, Amazon's position as a cost leader is eroding. No doubt, adding huge warehouses stocking tons of products across the country won't help lower expenses. Amazon's acquisition of Kiva Systems could potentially cut expenses considerably - we'll have to wait and see. Most importantly (in my opinion), for many products Amazon sells, convenience isn't a huge factor. Do you really need a book or a TV delivered right away, or would you rather wait and get the lowest price?

That's not to say I think same day delivery is a bad idea - in fact, depending on the product, I think it's a great idea. Groceries and drugstore goods, for instance, fit the model well. Time-sensitive purchases also work (e.g. air conditioners, last minute buys, and broken important stuff). Even Kozmo managed to be profitable in a few regions before it closed down. And while Amazon doesn't really sell these types of products yet, it wouldn't be the first time Amazon jumped into a new retail sector. I'll definitely be keeping an eye on what Bezos decides to do.

Anyone got any thoughts on the issue? Will same day delivery work (and if so, how), or is it over-hyped?

Tuesday, October 9, 2012

WOTD: bloviate, hirsute

First it was palaver (to ramble unnecessarily), then it was bromide (platitudinous and boring), now we can add bloviate and hirsute to our arsenal of awesome words found from political commentary.

Bloviate [bloh-vee-eyt]
v. to talk at length, especially in an inflated or empty way

Bloviate is a compound of "blow" with a pseudo-Latin ending, dating to the mid-1800s. It was popularized by none other than President Warren Harding, who was apparently an expert at it; he described it as "the art of speaking for as long as the occasion warrants, and saying nothing". This reminds me of Fedspeak, the incomprehensible, jargon-laden dialect Greenspan used as chairman of the Federal Reserve Board.

Hirsute [hur-soot]
adj. hairy; shaggy

From The New Yorker, "often dismissed even on the right as a hirsute blowhard, [paleo-con John] Bolton appears to have persuaded Romney to take him seriously".

I love election commentary: it teaches me new creative ways to insult people! Especially meat-head jocks who might not understand the insults themselves, such as the kind typically found at upcoming five-year high school reunions.

Friday, September 21, 2012

Education: Signaling Model vs Human Capital Model

I was going to selectively quote from this article by Bryan Caplan, but I ended up copying and pasting the whole darn thing. Just read the article; it's pretty short.

Bryan Caplan is an economics professor who writes a lot on the "signaling model of education": the idea that education is valuable not because of the knowledge and skills it teaches (as per the "human capital model") but rather because of the certification and fancy initials you get to put after your name (B.A., MBA, LL.D., Ph.D, etc.). In other words, education is valuable as a "signal" to others of your supposed abilities, regardless of whether you actually have them or whether said abilities are even relevant.

This isn't a new idea in and of itself, but it's become increasingly relevant as youth unemployment remains high, tuition costs remain unaffordable, federal student loans are reexamined and student debt reaches record levels.

Bryan points to a lot of strange distortions due to this signaling effect dominating the pure intellectual effects of education. For example, if the human capital model were correct and education's true value lay in its transfer of knowledge and skills, then everyone should audit.

The best education in the world is already free of charge. Just go to the best university in the world and start attending classes. Stay as long as you want, and study everything that interests you. No one will ever "card" you. The only problem is that, no matter how much you learn, there won't be any record you were ever there.

So why doesn't everyone do this? One theory could be that everyone actually really wants to, but the physical difficulties of being in the right location (geographically near a top university) and free at the right time (class hours are usually at daytime on workdays) are prohibitive. Fortunately, these are the problems that online courses (or if you prefer, MOOCs) try to tackle.

Thus, if the signaling model were false, online courses (of the same academic caliber and rigor) should be a perfect substitute for attending a real university. In my ideal world, they would be, but of course, that's not actually the case. Otherwise, why would online courses be trying to move into the certification business?

Has this always been true? More specifically, has the signaling effect always dominated the pure intellectual effects of education? I would argue no. Then why has it changed? Very simply, the price of knowledge has significantly decreased, thanks to technological advances and the democratization of information. In fact, the existence of sites like Wikipedia shows that knowledge is practically free (free as in free beer, not free speech). It doesn't make sense that while the price of knowledge has plummeted, the cost of education hasn't gone down (rather, it's gone up).

The final question remains: even if you're right, so what? As long as rational employers are fully conscious of the signaling model and completely cognizant of the fact that their hiring decisions are based on possibly nothing more than fluff and air, why should we care? If they choose to hire an ignorant Harvard grad over a more capable state school grad because they're too lazy to do their due diligence and look beyond credentials, then that's their loss. They'll fully bear the costs of their own decisions, as they are forced to pay for employee training programs to teach the skills that universities never did.

However, the reason that we should care - that everyone should care - is because of the negative externalities this imposes on the system. This is visible in the higher degree treadmill process observed in the US labor market: today we have so many bachelor's degree holders that job seekers are increasingly jumping to master's degrees. This has reached the point where jobs previously only requiring bachelor's degrees now require master's degrees, despite no obvious increase in job difficulty or worker aptitude. These inefficiencies incur costs that will eventually be borne by everyone.

Of course, the real world is rarely this black and white. Most realistically, education displays both effects of the signaling model as well as the human capital model. There are still people who care about learning for learning's sake and there are still industries which select candidates on a more meritocratic basis (programming and investing are two examples). Unfortunately, this seems to be the exception rather than the rule.

Wednesday, September 12, 2012

WOTD: ersatz, zaftig, tchotchke, risible, rigmarole, bromide

New words of the day!

Ersatz [er-zahts]
adj. inferior substitute

It's originally a German word meaning substitute, although in German it carries no connotations of inferior quality. In WWII, Nazi-captured POWs were given ersatzbrot, which was bread (brot) filled with sawdust, and English-speaking POWs adopted the word after returning from the war.

This next one is from Christina.

Zaftig [zahf-tik]
adj. pleasingly, plumply pulchritudinous; alluringly curvaceous

For a more explicit definition, look up zaftig in Google Images.

Tchotchke [chahch-kuh]
n. a small toy, gewgaw, knickknack, swag, bauble, thingamajig, doodad, lagniappe, trinket, or kitsch

That one was from Conrad. Apparently, it also means bimbo or slut. I'm not sure what Conrad was trying to say to me...

Risible [riz-uh-buhl]
adj. causing or capable of causing laughter; laughable; ludicrous.

Rigmarole [rig-muh-rohl]
n. an elaborate or complicated procedure 

Rigmarole / rigamarole is one of those words that's heard in speech more often than seen on paper. I've always wondered how it's spelled.

Bromide [broh-mahyd]
1. a person who is platitudinous and boring. 
2. a platitude or trite saying. 

Bromide also has a third definition, which is "a salt of hydrobromic acid consisting of two elements, one of which is bromine". From The Economist:

AS I mentioned in last night's live-blog, if sequestration comes to pass, Barack Obama will have to make do with a defence budget roughly equivalent (in real terms) to George Bush's outlay for 2007. That budget surpasses average annual military spending during the cold war. In other words, even with sequestration, America will still be in pretty good shape militarily. It will still spend as much as all of the other big militaries combined. It will still hold an immense advantage over China and the rest of Asia, where the Obama administration is focusing its resources, and Russia, which Mitt Romney thinks is America's greatest foe. [...] This is also a potent critique of Mitt Romney's call for increased defence spending. He offers little explanation of why America must spend more, besides bromides about American leadership.

Romney is a bromide who speaks in nothing but bromides. He leaves a bad taste in your mouth. Probably similar to the taste of bromide.

Tuesday, September 4, 2012

Strategic Asset Allocation: Endowment Model

This is the first in a series of posts covering different competing philosophies of strategic asset allocation*, and how one could replicate these philosophies. This post focuses on endowments.

Endowments (of colleges and universities) are basically funds of funds with two important differences: there's only one client and the time horizon for investments is extremely long term. This gives endowments a few advantages over other investors: they don't have to deal with marketability or PR, and they can allocate to non-public illiquid securities (which can take years or decades to realize their alpha).

Thus, a significant amount of the new thought in strategic asset allocation* is contributed by endowments and their managers.  The traditional strategic asset allocation standard has been the Modern Portfolio Theory-influenced 60/40 stocks/bonds portfolio that financial advisers love to reference (even now, I'd wager that most 401k's look like this). However, there is nothing "modern" about MPT or these portfolios - it's a sixty year old model. For the post-MPT world, there are a few competing philosophies when it comes to asset allocation:
  1. the endowment model (most commonly represented by Yale and Harvard)
  2. the Norway model (a sovereign wealth fund)
  3. Risk Parity (most famously espoused by Ray Dalio of Bridgewater)
Here is a good summary by one of my favorite bloggers of what exactly the endowment model entails:

DIVERSIFIED:  Broad diversification is embraced.  
HIGH-RETURN ORIENTED:  Equities and high-return alternatives are favored.  Other than as a safe-harbor, bonds are eschewed.
PERPETUAL INVESTMENT HORIZON:  While endowments have always had a perpetual horizon, the endowment investment model actually attempts to take advantage of this.
ILLIQUIDITY SEEKING:  One means of doing so, is to accept illiquidity and demand, in consequence, a higher return.
GLOBAL:  This is a corollary of diversification, but the endowment investment model was more global earlier than other investors.  The current manifestation of this approach is to overweight emerging markets relative to naive benchmarks.
LONG-TERM IN PERFORMANCE MEASUREMENT:  This is the corollary to the perpetual investment horizon, but merits separate mention.  EndowmentInvestor believe successful endowment management requires a long term view in evaluating managers and strategies.
ACTIVELY MANAGED:  Most endowments use active investment management heavily.  This is a requirement in asset classes like hedge funds and private markets where indexing is not an option.  Many commentators would characterize active management as an essential characteristic of the endowment investment model, but EndowmentInvestor does not believe it a requirement to be 100 percent active.
POTENTIALLY CONCENTRATED:  Those that believe in active management in the endowment world [the majority] avoid closet indexing by favoring a handful of concentrated managers.

As an example, on the right is the asset allocation of Harvard's endowment.

Following the popular nomenclature, I would call this portfolio a 36/13/51 allocation (36% to stocks, 13% to bonds, the rest to alternatives). We can see it's highly diversified, with no more than 14% to any single asset class. Furthermore, the high-return focus is obvious with an 87% combined allocation to equities and alternatives. Private equity, absolute return (hedge funds) and real estate are 37% of the portfolio, reflecting their capture of the illiquidity premium.

The allocation to public equity makes up 36% of the portfolio, and within that portion, equal parts go to domestic, foreign and emerging markets. In comparison, free-float-adjusted market cap weights in the MSCI ACWI (All Country World Index) give 50% to domestic, 40% to foreign and 10% to emerging markets. Some investors don't even follow the ACWI, instead indexing their equity allocations to the S&P 500 - an overweight of domestic equities called home bias, which reflects the tendency to favor the familiar. Harvard, by contrast, is heavily overweight international equities, especially emerging markets, relative to the ACWI. Although Harvard's public equity allocation is one of the more extreme examples, it's an excellent reminder of the importance of avoiding home bias through global diversification. On a long term basis, most investors would benefit by following the more diversified ACWI rather than the US-only S&P 500.

One significant feature of endowments which was omitted from the otherwise comprehensive list above is the emphasis on real assets. Harvard's allocation to real estate and commodities is 23%. Unfortunately, most 401k options don't give you commodity funds (although some give you natural resource equity funds), but that doesn't mean this real asset bias is impossible to replicate. David Swensen, CIO of Yale, advocates a 20% REIT allocation in his book Unconventional Success (if anyone wants to borrow a copy, ask me). Yale's endowment itself has a 29% allocation to real assets (20% real estate, 9% natural resources). The idea is that although short term volatility could be high, even bubble-like (considering real estate's recent global boom and bust), the long term fundamentals are excellent: real assets are the perfect hedge against inflation, and the world is running out of natural resources.

To summarize, an endowment replication portfolio would be heavy in equities (particularly international), real estate, commodities and natural resources. Although private equity and hedge funds aren't available to many investors, finding a good uncorrelated absolute return manager or strategy isn't impossible (he might even be your old college roommate).

The next few posts in this series will cover 1) Norway's sovereign wealth fund and 2) Risk Parity.

*The difference between strategic asset allocation (SAA) and tactical asset allocation (TAA) is that SAA is static and TAA is dynamic. SAA can be thought of as a framework upon which TAA is overlaid, although this is not always the case. Here are a few ways in which the two are combined:

1) On one extreme, you may have a SAA with no TAA, in which case you consistently rebalance to the same static allocation (such as 60-40 stocks-bonds) and never deviate. In this case, 100% of your alpha will come from your SAA.
2) On the other extreme, your strategy could be completely TAA with no underlying SAA allocation. Mebane Faber's TAA model (here's an implementation in R) rotates among the top performing x out of 5 asset classes. This asset allocation has no "base" case since the portfolio can look drastically different at each rebalance period, such as switching from 100% stocks to 100% REITs (as it did recently) in the x=1 scenario. In this case, 100% of your alpha will come from your TAA.
3) Finally, you may have a combination of both (1) and (2). For example, your SAA may be a 60/40 stocks/bonds allocation, but your TAA strategy is to shift +/-x% depending on recent outperformance in a two-asset version of Faber's rotation system. If x=10, your final allocation will vary between 50/50 and 70/30. Both SAA and TAA contribute to your alpha, but obviously as x increases, more of your alpha will be attributed to your TAA rather than your SAA.
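Case (3) is simple enough to sketch in code. Below is a minimal Python illustration of the 60/40 base with a +/-10% tilt toward whichever asset outperformed over some lookback window; the return figures are made-up inputs, not data from any actual strategy.

```python
# Sketch of case (3): a 60/40 SAA with a +/-10% TAA tilt toward the asset
# that outperformed over the lookback window (a two-asset take on Faber's
# rotation). The return inputs below are illustrative, not real data.

def blended_allocation(stock_return, bond_return, saa=(0.60, 0.40), tilt=0.10):
    """Return (stock_weight, bond_weight) after applying the TAA overlay."""
    stocks, bonds = saa
    if stock_return > bond_return:   # stocks outperformed: tilt toward them
        stocks, bonds = stocks + tilt, bonds - tilt
    else:                            # bonds outperformed: tilt away from stocks
        stocks, bonds = stocks - tilt, bonds + tilt
    return round(stocks, 4), round(bonds, 4)

print(blended_allocation(0.12, 0.03))   # (0.7, 0.3) -- the 70/30 extreme
print(blended_allocation(-0.05, 0.02))  # (0.5, 0.5) -- the 50/50 extreme
```

With x=10 the portfolio can only ever sit at 70/30 or 50/50; a larger tilt, or more than two assets, moves you further toward pure TAA.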

This post is mainly about SAA, which is irrelevant to investors who use 1) completely TAA methods, 2) bottom-up individual security selection (such as value investors) or 3) shorter term trading. However, for anyone with a 401k that has limited security selection (usually a menu of diversified mutual funds) and short-term trading restrictions (usually ranging from a month to a quarter), SAA is pretty much the only game in town, since those other strategies are effectively forbidden. Although it is possible to use a quarterly rebalanced TAA strategy in a 401k, most investors with 401k's are in the "set it and forget it" or "buy and hold" category, which again is basically SAA.

Monday, September 3, 2012

Jae's thoughts from starting a business

Here are two of the most important things I feel I've learned throughout the process of starting my own business. Both points are pretty clichéd, but I don't think that makes them any less important.

Know Thy Customer

The best business ideas are the ones that customers actually want. Seems fairly fundamental, but it's a concept that is oft forgotten. Many businesses end up falling in love with a "cool" idea and launch without paying heed to the customer. Unfortunately, even the coolest ideas make poor businesses if nobody wants to pay for them, and committing to an idea without knowing the customer is an unnecessary and potentially ruinous risk.

One of my new acquaintances gave me a good personal example. He had thought about working on a mobile app that would help people in NYC find clean bathrooms throughout the city for handy, quick use in lavatorial emergencies. Sounded like a good idea - it provides a solid solution to an urgent problem. However, after asking around in the city, he found that almost nobody he talked to would have found the app useful, since they already knew where to find a clean bathroom on any block in the city - Starbucks.

My friend averted any major commitments and expenses by speaking to his potential users early on; however, not all have been as prudent. WebVan, a one-time dotcom darling, earned the title of e-commerce's biggest bust of all time in part due to its failure to understand its target market. WebVan's hook was convenience: customers in specific metro areas could purchase all their grocery needs online and have them delivered to their doorstep. No more waiting in line at the register, no more wasting gas driving to the store. WebVan raised ~$1b and quickly launched. Once again, the idea seems pretty neat, until you consider that WebVan would have had to sell regularly to ~10-15% of its target population just to support its expenses. Not even the biggest retailers have that sort of market share today, in a highly standardized consumer electronics industry with computer-literate shoppers. There's no way customers in 1999 would have purchased perishable goods online in sufficient quantity to support WebVan.

It's important to remember what all businesses depend on - the customers. Know who they are, know what they are looking for, and build something that they will actually value.

Inspiration vs Perspiration

As important as it is to get the right, customer-centric idea, an idea is necessary but not sufficient for starting a business. Execution is the key. If you think your business is just an idea, then you don't have much, and it will drive you crazy. Anytime you see someone with a similar idea (and you will, guaranteed), you'll panic. Anytime you see a competitor getting ahead, you'll be disheartened. Your ability to execute is the only constant, and it's the only thing that can add value to your business in a unique way that cannot be touched by others.

I ran into a post by Derek Sivers a while back that I feel really illustrates this concept. It states that "Ideas are just a multiplier of execution."


AWFUL IDEA = -1
WEAK IDEA = 1
SO-SO IDEA = 5
GOOD IDEA = 10
GREAT IDEA = 15
BRILLIANT IDEA = 20

NO EXECUTION = $1
WEAK EXECUTION = $1,000
SO-SO EXECUTION = $10,000
GOOD EXECUTION = $100,000
GREAT EXECUTION = $1,000,000
BRILLIANT EXECUTION = $10,000,000

To find the value of a business, one should multiply the two together. Even the best idea is worth nothing more than a NYC lunch without execution.

Anyways, it's mad late, and I had no intention of writing so much, so I will end it here. Hope you guys enjoyed it, and I hope the points weren't too corny.


Friday, August 31, 2012

What do Programmers do?

As a software engineer, people often ask me to explain what I do at work.  I wouldn't have been able to explain it myself a few years ago, back when I thought (correctly) that programmers are mega nerds - I'd always considered programmers glorified typists.  That being said, let me try to explain what things are really like, from my own perspective.  While reading this, keep in mind that it's all from my personal experience, though I have had the opportunity to work at both massive and tiny companies.

First: some context.  I'm working at OkCupid Labs as a software engineer, and my position is heavily focused on backend work.  We work on small(er) software projects with the goal of bringing people together.  We're pretty much a startup.

Ok then!  Let's start at a high level, then work our way down: software teams have programs or program features they want to add to their applications, and as a software engineer I need to figure out how to turn an idea into a functioning (piece of a) program and then write it up.  I'll first go through and do some really high level design - is this feature technically feasible, and if so, what is a good way to structure the program and its data structures?  I'll run through designs in my head, reaching back to stuff I learned in college and things I've picked up since then.  I might doodle some stuff on paper or a whiteboard to sketch out designs if they're more complex, or if I need to discuss them with someone else.  I may even talk out loud to myself, or try to explain my ideas to an inanimate (1) object, because having to explain things often brings out the obvious flaws/solutions.  No matter how I do it, I'm aiming for an efficient and reliable design (2).

If all went well and I've filled my notebook/whiteboard/coworker's head with my brilliant, working designs, then it's on to coding this beast up!  I grab my favorite text editor and start writing code in whatever language my group has chosen.  It's kind of like this, but more like writing an essay with proper structure and syntax in which you describe in very precise terms what the computer should do.  Inevitably I'll make some mistakes in writing this essay - maybe this construct here didn't do exactly what I meant, or maybe I just left out a word (or a whole sentence, or idea).  These will come up later as bugs, which will hopefully be caught by me when I test the feature, but may end up harassing some poor user in the distant future if not found.  Bugs are inevitable - they will always be in your code; all you can do is minimize your chances of introducing them into the system, and find and fix as many as you can.

Something really important to mention about coding is that it's a very creative exercise.  It's not quite like writing a recipe so much as writing an essay or a poem.  Just because the instructions to the computer must be precise and accurate doesn't mean there's only one way to do it.  Code has a style to it which makes everyone's look a little bit different.  Spacing, naming conventions and commenting may make one person's code look very different from someone else's, and yet it may do the same exact thing.  You may find one person's style much more readable than another's.  Even outside of code style, there are nuances in a program that may make it slightly more or less efficient in terms of CPU cycles or memory.  It's often easy to tell how good a coder is by looking through their code - sloppy, complicated designs are marks of a novice, while experts write code that most programmers would describe as elegant and simple.  Less code is often better code.

While running through this design and implementation of a feature, I won't necessarily be working alone.  I'll be talking to other engineers about how to hook my feature up with the rest of the code, asking their opinions and perhaps working with a partner on the design and coding of the feature.  There may be meetings in which various phases of this process are discussed, and there will always be requirements (things that this program must do).

At the end of all this, I'll get to 'ship' my code, meaning that it will either be put onto a website for people to use, put up for download, or physically shipped on a DVD or other physical media.  Still, it doesn't stop there.  Unlike school projects, no software project is ever truly finished - there's maintenance and improvement to be done as long as the software is still in use.  Maintenance means fixing bugs and making sure that the program still works given new technologies, while improvement might mean adding more functionality to the program, or going back to fix some of the design mistakes that were made previously.

Though I haven't spoken about this very much, software development is a huge team effort.  There are product people, designers, frontend engineers, testers, technical writers, managers, etc.  Everyone contributes to a system that won't function without all components working together.  Each day we come in, work on a feature, fix some bugs, and discuss the future of the project.  It's an incredibly stimulating job where you're always trying to solve some new problem - and, to me, it's incredibly rewarding.

1) I hear rubber ducks work best.
2)  Efficiency and reliability - what do these mean?  Reliability is a bit easier to explain: the software you build should work every time, no matter what buttons you mash, etc.  If it doesn't work every time, then it should at least fail gracefully - no blue screens of death, please.  Efficiency is a bit more of a challenge.  Like most things, a computer has limited resources, including memory in the form of RAM, disk space, CPU cycles, and network bandwidth.  If you have an application that uses any of these poorly, you end up with a program that doesn't run at all, runs poorly on low-end systems, or runs but takes so long that it would need 80 years to compute a solution (check out generating the Fibonacci numbers recursively - the naive algorithm grows at roughly O(2^n), meaning each extra number you want to generate increases the amount of work needed exponentially).
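The Fibonacci example in (2) is easy to demonstrate. Here's a quick Python sketch: the naive recursive version redoes an exponential amount of work, while simply caching intermediate results makes the same recursion effectively linear.

```python
# Naive recursive Fibonacci: roughly O(2^n) calls, because each call spawns
# two more and the same subproblems are recomputed over and over.
from functools import lru_cache

def fib_naive(n):
    return n if n < 2 else fib_naive(n - 1) + fib_naive(n - 2)

# Same recursion, but cached: each fib(k) is computed exactly once.
@lru_cache(maxsize=None)
def fib_fast(n):
    return n if n < 2 else fib_fast(n - 1) + fib_fast(n - 2)

print(fib_naive(10))  # 55
print(fib_fast(80))   # 23416728348467685 -- instant, unlike the naive version
```

Try `fib_naive(40)` and you'll feel the exponential blowup; `fib_naive(80)` is the version that would take decades.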

Saturday, August 25, 2012

Intro to Poker

Yayy, my first post!

I figure I'll talk a lot about poker on this blog since it relates to a lot of interesting concepts: probability/risk, expectation, and psychology.  I'll start with some probability today.

The Theory of Poker: 

If you played a million hands with someone heads up, or a million hands at a 9-person table, the only way you make money in the long run is if other people make mistakes, or rather, make more mistakes than you do.  Mistakes include folding with the best hand, calling with a worse hand (without correct odds), or even making a less positive EV bet (like calling when raising is optimal).  In the end, if you make fewer mistakes than the other person, you win.

Intro Poker Probability:

To talk about poker, we need to talk about probability first.  Say there are 12 dollars in the pot, you face a bet of 2 on the turn (with one card left to come), you have the nut (best) flush draw, and you know your opponent has Aces (a pair of Aces).  You should actually call.  There are 9 outs (cards that will allow you to win the hand) among the 46 unseen cards, so you need 37:9* odds, or a little over 4:1, and you are getting 6:1, so your call has positive EV.  But poker is never as simple as this, since 95+% of your decisions are not on the river.
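Here's a quick sanity check of that turn decision in code (reading the example as it's stated: you call 2 for a chance to win the 12 in the pot):

```python
# The turn decision above: 9 flush outs among 46 unseen cards,
# calling $2 for a chance at the $12 pot.

def ev_of_call(outs, unseen, pot, bet):
    """Expected value of calling `bet` to win `pot` (pot excludes your call)."""
    p_win = outs / unseen
    return p_win * pot - (1 - p_win) * bet

print(round((46 - 9) / 9, 1))              # 4.1 -> you need better than ~4.1:1
print(round(12 / 2, 1))                    # 6.0 -> you're getting 6:1
print(round(ev_of_call(9, 46, 12, 2), 2))  # 0.74 -> positive EV, so call
```

Since 6:1 pot odds beat the roughly 4.1:1 odds against hitting, the call shows a profit of about 74 cents per attempt on average.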

Conditional Probability and Implied Odds/ Reverse Implied Odds: 

Conditional probability is hugely important in poker.  Ask yourself how likely you are to be best, but also ask what can happen on future streets if you are in fact best, and what can happen if you are not.  Here's an example.  You raise one off the button (the button is the dealer; one off means you're to the right of the dealer with 3 people left to act, while three off would mean 3 seats to the right of the dealer with 5 people left to act) with A3s (Ace-3 suited), and the dealer and BB (big blind) call.  The flop is A 7 4 rainbow (no suit matches), so it's great to think your pair of Aces is best - you want to bet, right?  Well, you can bet and take down the pot most of the time, but you're only getting called by a better Ace (an Ace with a higher kicker/other card) and only a few worse hands.  Let's think about checking (not betting).  You're likely to win the hand at showdown (after all cards are revealed), unless someone hits two pair or runner-runner something.  And if the guy behind you has a better Ace and bets, at least you have more information to make a better decision.  So in fact, checking is best here, because of conditional probability.

Now say you have pocket fives on the button (you're the dealer), and there's an open raise (the first raise and first action preflop) and a call in front of you.  The pot is giving you just above 2:1, and the chance you flop a set (another 5 comes out) is roughly 7.5:1 against.  This sounds terrible, but when you do hit your set, you may win much more than what's in the pot on later streets.  You have a hidden hand, and top pair is likely to give up a lot of money in that scenario, so you have implied odds.  As a general rule, the other players need about 20x the bet you're calling preflop behind them - meaning if the raise is to $7, they need about $140 behind to make your call worth it, to make up for the times they fold to you and you don't get paid.
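The set-mining odds are easy to verify yourself.  With a pocket pair, 2 cards of your rank remain among the 50 unseen cards, so the chance the flop misses you entirely is C(48,3)/C(50,3).  A quick sketch:

```python
from math import comb

# Probability of flopping a set (or better) with a pocket pair:
# the flop must contain at least one of the 2 remaining fives.
p_no_set = comb(48, 3) / comb(50, 3)  # all 3 flop cards avoid your rank
p_set = 1 - p_no_set

print(round(p_set, 3))             # 0.118 -> you hit about 12% of the time
print(round(p_no_set / p_set, 1))  # 7.5   -> roughly 7.5:1 against
```

That 7.5:1 against, compared with the roughly 2:1 the pot is offering, is exactly why you need the deep stacks behind: the implied odds have to make up the difference.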

Reverse implied odds are the opposite.  You open-raise with AJ and a guy re-raises you preflop.  The pot odds, let's say, are 2:1, and let's say he sometimes re-raises with worse hands like KQ and AT (Ace-Ten), and if he's got a pocket pair like tens or nines it's a coin flip - so you should call, right?  Well, no.  If the guy has AK or AQ, you're absolutely crushed and you don't know it.  If an Ace comes out, you're likely to lose a lot of money.  If the guy has Aces, Kings or Queens and the flop comes out Jack-high, you're losing even more money.

I'll end this post here, because it's getting long.  But I promise I'll get into more interesting stuff down the road, including how poker is a good simulation for other things.  Most importantly, I can teach you how to be positive EV at the casino.

*With a 1/3 probability of winning, you need at least 2:1 odds.  Odds are listed as failures:successes, unless you say "3:2 favorite", in which case it's successes:failures.  It's not very intuitive for us math geeks, but gamblers use it because it frames the probability correctly: if there are 2 in the pot and you face a bet of 1, you need 2:1 odds.