I’ve been scratching my head over a few posts on the Quandl Blog about alpha decay. I commented on these posts, “Alternative Data – The Newest Trend in Financial Data” and “The Unbearable Transience of Alpha”, in a recent blog post.
Specifically, one line best encapsulates my concern: “Professional allocators will not pay hedge fund fees for the execution of strategies that are on the first year curriculum of any Masters of Finance program.” It makes sense that if it were easy, anyone could do it. But, according to the efficient market hypothesis, if everyone does it, then “it” ceases to be. In other words, Tammer would suggest that people who still use things that everyone knows about and does (i.e., P123 users who rely on fundamental data and tired old value investing formulas) are basically guaranteeing their own mediocrity. He proposes “alternative data” as one way of maintaining an edge.
This idea is not all that “fringy”:
On episode 117 of the Investor’s Podcast, Bill Miller contended that conventional value screening methods increasingly return value traps (i.e., stocks whose low valuation ratios deserve to be low). And in a recent interview with Barry Ritholtz, Howard Marks remarked that typical investors will achieve typical performance – to be different, one must think differently.
I’ve accepted the idea that the alpha content of fundamental and other conventional data is decaying. As a result, I should rely more heavily on recent performance (via Bayesian inference) to infer future performance, but not so heavily that I forget the past. I’ve also accepted that I should make my personal investing system as unique and as “irreplicable” as possible. But I still can’t shake the notion that maybe I am a decade or so late to the game – that despite all my best efforts, future performance will be mediocre because I am using data that everyone knows about and has access to. Still, I remain hopeful that the somewhat more proprietary nature of Compustat’s Financial Statement Balancing System makes it more resilient to alpha decay than commodity fundamental data.
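To make that “weight recent performance, but don’t forget the past” idea concrete, here is a minimal sketch of the kind of Bayesian update I have in mind. It is purely illustrative: the return series are made-up numbers, and the normal–normal model (long-run history as the prior, the recent window as the likelihood) is an assumption for the example, not a claim about how any particular strategy behaves.

```python
import numpy as np

# Hypothetical monthly excess returns of a strategy (illustrative numbers only).
long_run_returns = np.random.default_rng(0).normal(0.004, 0.03, 120)  # ~10 years
recent_returns = np.random.default_rng(1).normal(0.001, 0.03, 24)     # last 2 years

# Prior: belief about the strategy's true monthly alpha, built from long-run history.
prior_mean = long_run_returns.mean()
prior_var = long_run_returns.var(ddof=1) / len(long_run_returns)  # variance of the mean

# Likelihood: what the recent window says about the same alpha.
obs_mean = recent_returns.mean()
obs_var = recent_returns.var(ddof=1) / len(recent_returns)

# Normal-normal conjugate update: the posterior mean is a precision-weighted
# blend, so recent data pulls the estimate without erasing the past.
post_precision = 1 / prior_var + 1 / obs_var
post_mean = (prior_mean / prior_var + obs_mean / obs_var) / post_precision
post_var = 1 / post_precision

print(f"prior alpha estimate:     {prior_mean:.4%}")
print(f"recent-window estimate:   {obs_mean:.4%}")
print(f"posterior alpha estimate: {post_mean:.4%} (sd {post_var**0.5:.4%})")
```

The precision weighting is the point: the more data (or the less noise) in the recent window, the harder it pulls the estimate away from the long-run prior, which is exactly the balance I’m trying to strike.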
While the alpha-decay hypothesis does make sense, it contrasts with one of my basic philosophies: Keep It Simple, Stupid (KISS). KISS is a crude way of saying that elegance and simplicity almost always beat complexity and opacity. Simplicity works because simple things are easy to understand – it decreases the likelihood of over-fitting and conflation. But how should one reconcile the alpha-decay thesis with the KISS philosophy?
Each time I attempt to answer this question, I just come up with more questions, each more difficult to answer than the last:
Does this really mean that complicated systems are less susceptible to alpha decay because they are harder to replicate and therefore less likely to be replicated?
How much more efficient has the market become as a result of the proliferation of algorithmic trading? How much more efficient can it become?
At what point does active investing no longer pay off?
On the other hand, will the mega-trend towards passive management preserve or even grow market inefficiencies in the future?
Is it possible to achieve a golden balance between simplicity and uniqueness? One that pairs ideas understood to be fundamentally sound with an implementation complex enough to resist replication… one that is within my technical ability to automate and yet still resilient to the forces currently assaulting the moat of my alpha.
I’d love your thoughts…