
How the Economy Works (According to Ray Dalio)

Over the past three or so years, I have written well over 1,000 pages on personal finance and investing.  Every concept, it seems, requires diving into another concept.  Explanation calls for explication, after all.  Given this background in the explication of often difficult topics, I was stunned by the elegant simplicity of Mr. Dalio’s “How the Economic Machine Works” video.  It is cleverly animated, and you should watch it:

 https://youtu.be/PHe0bXAIuk0

As Mr. Dalio suggests in his concluding remarks, the video presents an oversimplification.  Obviously, the complete science of economics cannot be adequately covered in a 30-minute animation.  However, it does hit the high points, and provides a reference for those of us who have never considered credit cycles and how they run the economy.

Mr. Dalio’s basic premise is that the economy runs “like a machine.”  Because most people don’t understand how the machine works, there has been a lot of needless economic suffering over the ages.  According to Mr. Dalio, this simple understanding is what enabled him to anticipate and sidestep the most recent global economic crisis, and it has informed his investment decisions for over thirty years.

He uses the machine analogy because at its core, the economy functions in a simple, mechanical way.  Like a natural scientist, he breaks down the functioning of the economy into its most basic parts: Transactions.  Each transaction may be simple in its own right, but there are “zillions” of them each and every day.  Humans conduct these transactions, and they are driven by human nature. This collective action results in three major forces that drive the economy.  The first major economic force he discusses is productivity growth.   The second major force is what he calls the short-term debt cycle.   The third is the long-term debt cycle.   These three factors go a long way in explaining Gross Domestic Product, which is the pulse of the national economy.

The easiest way to visualize how these factors work together is to plot them on top of each other as lines on a graph.  The height of the line (the vertical axis) tells us the level of GDP and the “run” of the line (the horizontal axis) tells us the amount of time that passes.  In this way, we can see how conditions are changing for better or for worse over time.

Mr. Dalio defines an economy as the sum of the transactions that make it up.  Transactions involve the transfer of something of value for something else of value.  In our economy, the buyer usually transacts in cash, and the seller is usually selling some good or service (or financial assets like stocks and bonds).  Any time you buy or sell something, you are creating a transaction. It is critically important to note that the buyer doesn’t care whether you use money that you have earned or credit when you conduct a transaction.  They get their money regardless. Therefore, credit and money spend the same way and a transaction takes place when either is used to buy something.



Why the Stock Market Will Crash Soon

The Fed says that there will be more rate hikes than Wall Street expected, and market watchers yawned.  Mr. Powell said that inflation is creeping up, and the Fed will nip it in the bud, and the S&P 500 goes up the next day.  Investors are assuring the talking heads that we are in a Goldilocks economy.  Twitter is near $47.  Strange things are indeed afoot at the Circle K.  All of this optimism and forward momentum can only mean one thing: The stock market must crash soon.

I don’t think anyone seriously entertains the notion that multiples aren’t stretched, with the S&P 500 back within a few points of its all-time high and the NASDAQ off on a parabolic run—again.  There is assuredly exuberance, but many claim it is rational and that there is huge potential for upside.  The P/E for the overall index is somewhere around 25x.  That should be pretty scary, but for some reason, it is not.  The Bulls still hold sway, and valuations are stretched beyond reason.  The economy is doing great, I grant.  I also grant that a hot economy tends to presage a reversion to the mean.  I’ve heard barely a mention that the Shiller P/E is around 33x.  In real terms, that’s higher than just before the Great Depression and higher than just before Black Monday.  It has been higher than it is now during only one period in the history of the market (at least back to 1890).

There are many arguments as to whether the benefits of the tax cuts can stretch into 2019.  The underlying problem is that such stretched multiples require stellar growth to even come close to justification.  The tax cuts were a one-trick pony when it comes to growth.  The idea that the tax cuts would produce enough capital expenditure to fuel enough growth to pay for themselves was a pipe dream, and with the frenzy of buybacks and mergers, we can be sure it isn’t going to happen.  We are priced for a Goldilocks market with never-ending stellar growth.


The IMF says growth is going to slow down in 2019.  Warren Buffett is sitting on a huge cash hoard, something Berkshire doesn’t often do.  It makes one think a big stock sale may be coming.  Ray Dalio sees “asymmetric risk” around the world, and a looming recession somewhere about 2.5 years from now.  Recessions, he points out, usually hit the financial markets around 14 months before you get to a full-blown recession.


The bottom line is that current valuations are beyond reason, and the various actual economic stimulants that the market has benefited from are starting to fade.  We may get one last big push from an infrastructure stimulus package, which we can ill afford.  Any further rise is due to animal spirits and not anything Ben Graham would approve of.  At this stage in the game, reversion to the mean is a matter of when and not if.  If you became an index investor sometime in the last 8 or 9 years and have been pouring money into the S&P 500, then be prepared to realize what that strategy means.  You will be tested in the near future, and if you don’t have the iron will to stay invested when things get ugly, you should start getting defensive now.

Black Swans in Social Scientific Inquiry

Nobel laureate Daniel Kahneman stated simply, “The Black Swan changed my view of how the world works” (Kahneman, 2011).  On its face, Nassim Taleb’s masterpiece is a book about risk management in the world of finance.  It is required reading in finance programs around the world, and it has influenced some of the most important names in the academic discipline and in the investment world.  I argue that the intellectual legacy of the book is more fundamental than merely understanding risk management in stock and options trading.  I believe it is an eye-opening exposé on human understanding generally and quantitative modeling specifically, and his key points have massive implications for the uses and misuses of statistical modeling in social scientific inquiry.

A black swan is a highly improbable event with three principal characteristics: It is unpredictable; it carries a massive impact; and, after the fact, we concoct an explanation that makes it appear less random and more predictable than it was. For Nassim Taleb, black swans underlie almost everything about our world, from the rise of religions to events in our own personal lives.

Critics of The Black Swan accuse the text of being “dumbed down” in comparison to the author’s more academic works.  Taleb does favor plain English to mathematical and philosophical jargon, and he does offer copious explanations and examples of his key concepts.  This may cause mathematicians to pause, but it is a boon to the student trying to understand the role of highly impactful yet improbable events in the world.  

Students of the social and behavioral sciences are bombarded with the ideas of normally distributed data and normal probability distributions.  We are told that our models (and the associated tests of statistical significance) rely on these assumptions, but often we fail to grasp just how catastrophically wrong we often are when we violate these basic and often forgotten assumptions of social research.  For the student seeking to truly grasp the point of null hypothesis significance testing and its most critical assumptions, The Black Swan is a must-read exploration of the specification and misspecification of mathematical models that seek to explain our social reality.   

References

Kahneman, D. (2011).  Thinking, Fast and Slow.  New York:  Farrar, Straus, and Giroux.         

Taleb, N. (2007).  The Black Swan: The Impact of the Highly Improbable.  New York: Random House.

The AIs Have It: The Grim Future of Investing

In a recent Twitter thread started by the venerable Jim O’Shaughnessy (@jposhaughnessy), I said:

If everyone were a rational long term investor, the market return would be precisely equal to GDP growth.  I personally am grateful to those that build castles in the air. My biggest fear is the AIs will all converge on the same algos, & start doing identical logical stuff.

As several of Mr. O’Shaughnessy’s followers pointed out, I was incorrect in relating Gross Domestic Product (GDP) to equity returns.  In truth, GDP and equity returns are derived from different elements.  The common thread is that equity returns are related to GDP, because the profit growth of companies is a major component of it.

GDP can be broken down into four expenditure components, as the Atlanta Fed explains:

One way that we calculate GDP is by looking at sources of expenditure—you know, what are people buying? This gives us a standard GDP identity of consumption plus investment plus government spending plus net exports. Consumption is fairly straightforward—it’s the largest component in the U.S. It’s about 70 percent of our overall economy. That’s just the stuff we buy, goods and services. We are becoming a more and more service-oriented economy, but that’s not because we are buying less goods. It’s just because our income level now on a per capita basis is high enough that a lot of us already have all the toasters that we want…. Now we are buying medical care or dentistry or meals outside the home or some services that we couldn’t afford before. And so a transition towards a more service-based economy is not really a bad thing, particularly for high-income countries. It’s just a sign of growing income.
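The expenditure identity in the quote above can be sketched in a few lines of Python.  The component figures below are purely illustrative, not actual data:

```python
def gdp(consumption, investment, government, exports, imports):
    """Expenditure identity: GDP = C + I + G + (X - M)."""
    return consumption + investment + government + (exports - imports)

# Hypothetical figures in trillions of dollars (illustrative only):
total = gdp(consumption=14.0, investment=3.6, government=3.5,
            exports=2.5, imports=3.1)
share = 14.0 / total  # consumption share of the economy
```

With these made-up numbers, consumption comes out to roughly 68% of GDP, in the same ballpark as the “about 70 percent” the quote mentions.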

Earnings growth depends on nominal GDP, since earnings form a part of GDP.  According to Professor Robert Shiller, earnings per share on the S&P 500 grew at a 3.8% annualized rate between 1874 and 2004.  If we adjust those numbers for inflation, we have a real growth rate of 1.7%.  The annual S&P 500 dividend yield averaged just 1.99% between 2009 and 2015.
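The inflation adjustment follows the standard Fisher relation.  A minimal sketch, where the ~2.1% inflation figure is an assumption chosen to reconcile the 3.8% nominal and 1.7% real rates above:

```python
def real_rate(nominal, inflation):
    """Fisher relation: convert a nominal growth rate to a real rate."""
    return (1 + nominal) / (1 + inflation) - 1

# 3.8% nominal EPS growth, assumed ~2.1% average inflation:
print(round(real_rate(0.038, 0.021), 3))  # → 0.017, i.e., 1.7% real growth
```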

Stock returns can be attributed to a mere three elements:

  1. The first is the earnings growth of your particular segment of the market over a given time frame.
  2. The second is all dividends paid out to you over that time frame.
  3. The third is the change in valuation (P/E ratio) applied to the asset between the start and end dates.
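Those three elements can be combined in a small sketch.  Price here is modeled as earnings per share times the P/E multiple, and all the numbers are hypothetical:

```python
def holding_period_return(eps_start, eps_end, pe_start, pe_end, dividends):
    """Total return from earnings growth, P/E change, and dividends.
    Price is modeled as earnings per share times the P/E multiple."""
    price_start = eps_start * pe_start
    price_end = eps_end * pe_end
    return (price_end - price_start + dividends) / price_start

# Hypothetical stock: EPS grows $5 -> $6, the market re-rates it
# from 20x to 22x earnings, and it pays $3 in dividends along the way.
r = holding_period_return(eps_start=5.0, eps_end=6.0,
                          pe_start=20.0, pe_end=22.0, dividends=3.0)
print(r)  # → 0.35, a 35% total return
```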

For a single stock, earnings growth can be significantly different from the mean in any given period.  If we choose an index such as the S&P 500 as a proxy for the “market,” we will find a modest yet persistent positive tilt to the upside.  In theory, one would expect market earnings growth to be a flowthrough of GDP growth.  If the economy is doing very well and growing at an impressive clip, then American corporations will prosper and “a high tide lifts all ships.”

AIs and the Future of Investing

Many investors see algorithm as a four-letter word, and call an algorithm an “algo.”  The reason for this hatred is that algos move massive amounts of stock at unfathomable speeds. It is reasonable to assume that volatility is increased when the most common “triggers” in the data feed are hit.  For this to work, other traders have to be present to take the opposite side of the trade. The current state of the art is for humans to stick in the variables and set the cut points for executing trades.

Imagine that Moore’s law stays in play and that computing power keeps doubling every 18 months or so.  There will come a time in the not-too-distant future when self-teaching AIs will begin to devise and test all possible models using all possible data and determine the most accurate regression equations possible.  For a while, the firms that build the best AIs will have a distinct advantage and make staggering amounts of money.  As AI technology becomes ubiquitous and prices begin to drop (as is part of the lifecycle of all technologies), more and more people will have access to the absolute best possible models.

After massive volatility storms rock the market for a period when these advanced, hierarchical models first appear, things will start to settle down in a dramatic way.  If every investor has a portfolio based on the model with the absolute highest R-square possible, then everyone will essentially want to make the same trades at the same time.  Under current market conditions, alpha is largely achieved through arbitraging irrational behavior by human actors.  If the “animal spirits” are banished to the netherworld, trading will become a very boring pastime indeed.  Irrational panics will be a thing of the past, as will be “irrational exuberance.” My predictions are that:

  • Volatility will drop to all-time lows, and will only spike when new fundamental economic data is released or when there is a fat-tail geopolitical event (e.g., 9/11).
  • P/E ratios will shrink to historical lows and trend sideways.
  • Productivity will increase manyfold, so GDP and wages will rise at a much-accelerated rate.

Given such a market and applying my assertions, the market return will boil down to

earnings growth + dividends

If we plug in the historical averages mentioned above, we get something like 2.0% + 1.7% = 3.7% growth per annum in the value of equities.  That figure puts us in the same ballpark as overall GDP growth.  It also puts equity returns in the same ballpark as bond returns.  In the not so distant future, the quest for alpha will be relegated to myth and legend, much like the quest for the Holy Grail.

What does “Open” Mean in Open Educational Resource (OER)?

It may seem very strange to those not exposed to the Open Educational Resource (OER) movement, but the everyday term “open” is imbued with layers of meaning, much like lawyers use the term “cause.”  Simply put, it means a lot of different things to a lot of different people.  The idea of “open” intellectual property is not a new one.  Many writings and other creative works have been within the public domain for a very long time, or since inception.  You can freely do whatever you want with the text of the Constitution of the United States.  You can copy it, store it, modify it, retain it, and even sell it.  Millions of other works fall into that category, as do very old works whose authors are long dead.  You can rest assured that Chaucer will not be suing you for printing out an excerpt from the Canterbury Tales and passing it out to students.

The idea of OER stems from the idea that people may want to give their work to the world free of charge, but they want to retain some rights.  The most common right retained (no matter what license or lack thereof is used) is the right to attribution.  That is, if you use my work, the least you can do is tell your readers that it is mine.  Many in the OER community are zealous about this idea of unfettered information and advocate a sort of information anarchy where knowledge is power, and the power belongs to everyone.

UNESCO describes OER this way:  “Open Educational Resources (OERs) are any type of educational materials that are in the public domain or introduced with an open license. The nature of these open materials means that anyone can legally and freely copy, use, adapt and re-share them.”  This most certainly gives many potential OER writers and users pause.  I refer to this as the expansive definition of open because “freely” covers a vast amount of ground, most of which is unnecessary for teachers who simply want to provide high-quality, no-cost educational materials to students.  The salient characteristics are commonly referred to as the “Five Rs”:

Retain:  This essentially means you own the content and nobody can take it away.  Things like corporate “access codes” that expire four months into a four-and-a-half-month term certainly don’t qualify.  If you buy a book, you own the book and thus can retain it.

Reuse:  This means you can use the material in any way you see fit, such as including it in an online course, teaching from it in a classroom, or making a video of yourself presenting it.

Revise:  This means you can alter the content, such as adding your own content, updating the content, or translating it into another language.

Remix:  This means you can take chunks of the work and add them to chunks of other works.  Many style guides would treat this as a “long quote,” but there are no limitations as to length.

Redistribute:  This means you can pass on the original work to others, and this also includes your “revisions” and “remixes” of the content.

These ideas are often seen as binary states.  Either you can redistribute the content or you cannot.  Either you can revise the content or you cannot. Creative Commons (CC) licenses do a great job of defining these binary states for your work in both legal and human language.  CC licenses are so ubiquitous in discussions of OER that the two are often conflated by educators wanting to provide students with no-cost materials, but not knowing how to approach the problem.  

I support and applaud the work the Creative Commons folks have done and will continue to do, but I acknowledge that those binary options are not appropriate for every author’s needs and tastes.  I for one think they are bad for scholars working toward tenure and libraries that need to justify expenses to administrators and legislators.  My OER materials are released under a license of my own devising, and some people wouldn’t consider them “open” by the expansive definition.  This leads to my applied definition of “open” that professors considering OER should focus on.  There are really only a couple of criteria:

  1.  Students can access the material free of charge, both legally and practically.
  2.  Students will always be able to access the material free of charge.

One fundamental principle of OER is the need of educators to rethink the role of “the textbook” in what we do.  We don’t use textbooks in real scholarship.  We analyze many different things (mostly primary sources) and synthesize them into a cohesive whole when we write.  I urge you to think of your classes as being defined by your syllabus and your sense of what is valuable in your context.  If we think that way, we don’t have to worry about finding the “right book” for the course.  There is no more telling students “We’ll be skipping chapter five” and sheepishly avoiding the fact that we aren’t going to cover half the book at all.  You may use any combination of primary sources, government documents, YouTube videos, and CC licensed textbooks.  The possibilities are limitless.

My point is not to insult the idea of open resources as defined by the expansive view, but to point out that in a practical sense the standard is very high and often difficult to reach by busy teachers struggling to curate high-quality resources for students.  Perhaps the most common example of my point is the use of YouTube videos in online courses. These are not CC licensed; they are most often licensed under YouTube’s “standard” license. This does not hamper your students at all. Provide a link or embed the video and away you go.  My overarching point is don’t overthink CC licenses and limit yourself to CC licensed content. If I want my students to read something on a Nobel Prize winner’s blog, then I’ll provide a link to the content.

One final point:  Most of the material you will want to use was written by scholars with a passion for their fields.  If you email them and ask to use their content in a specific way and you aren’t trying to make money from their hard work, you will get permission most of the time.  The CC license at the bottom of the page is a quick, clear, and unambiguous green flag to use the content, but it is not necessary.


Demystifying Beta

It has often been said that “the market loves certainty.”  Most investors (excluding those who seek to capitalize on volatility) would love it if stocks grew in a nice, linear way that was easy to predict and explain.  Alas, stocks don’t do that. They grow in an up and down pattern that is reminiscent of an EKG readout. All that up and down movement overwhelms the brain, and makes it hard to figure out what is going on over the long run.  Since we can’t get stocks to grow in value as a nice, elegant linear function, we tend to look at trends.   

On graphs, we can often use lines to show what the trend of a particular stock’s value is over time.  One particular method of doing this is a statistical technique called linear regression.  It essentially takes the average of all the ups and downs and draws a line based on those averages.   You could do the same thing with a ruler by “eyeballing it,” but the results wouldn’t be as precise as the trend line and associated equation that is mathematically generated by a computer.  

That last line may have made you cringe a little; I used the words “mathematically” and “equation” in the same sentence.  If you had flashbacks to your high school algebra class, I apologize.  But you needn’t be afraid; all of the math is done by computers these days.  All you need to remember from algebra class is that equations can be shown as a line on a graph.  Regression analysis capitalizes on this idea in predicting the average movement of data points (stock prices) that don’t move in a nice, straight line like those homework problems from algebra class.  Regression analysis has gotten a bad reputation because of its association with math.  Try to forget that; regression is a very useful tool for the investor.  All the hard work is done behind the scenes.  All you have to do is interpret the results.  There are very easy rules of thumb for interpreting that information.  Feel free to write those down; this isn’t algebra class, and you can’t get in trouble for cheating.

If you were to ask an economist, she would probably say something like “a particular stock’s beta is calculated by dividing the covariance of the stock’s returns with the returns of a specified benchmark by the variance of the benchmark’s returns over a specified period.”  My guess is that you didn’t find that very helpful.  Let me break it down for you; it’s an easy concept to grasp once we translate the statistical jargon into trader jargon.  When we measure anything (such as a stock price) over time and we get different results, we call that thing a variable as opposed to a constant.  Stocks are certainly variable!

That movement of the measurement from value to value is called variation.  Statisticians measure this variability with a number called variance (closely related to standard deviation).  Simply put, variance is a particular statistic that measures the variation in something that varies, such as a stock price.  In the case of stock prices, low variability (as measured by variance) means that the stock’s price doesn’t move much. A high variance means that the stock’s price is bouncing around all over the place.  Traders don’t often use the word variability; they talk about the amount of movement in a stock’s price in terms of volatility.  It may not be precise, but you will probably be okay thinking of variance as a measure of volatility.

Enter the idea of covariance.  As you’d expect, “co” is a prefix meaning “together.”  So the idea of covariation is the idea that two measurements will vary together and, if we generate a scatterplot, the dots will form a line.  For example, we’d expect a high degree of covariance between a stock’s market price and its price to earnings ratio.  If the P/E ratio was the only factor in determining stock prices, then all of the dots would fall on the line perfectly.  Statisticians would refer to this as a bivariate (meaning two variables) problem, because there are only two variables being considered.

Stock prices are a multivariate (meaning many variables) problem. There are dozens of potential factors that influence stock prices, and only some of them are quantifiable (If this weren’t the case, I could come up with an equation to model future growth and have retired already).  Note that the idea of covariance is conceptually identical to the idea of correlation.  

So, the big idea of regression analysis is to demonstrate as precisely as possible how two things systematically vary together.  We can apply this idea to see how much the variability (volatility) of a particular stock matches the variability (volatility) of a benchmark.  That is what Beta is.  While any benchmark can be plugged into the equation, most often the variance of the S&P 500 is used with stock prices.  Beta, then, is just a ratio of the volatility of a particular stock and the volatility of the S&P 500.  The math tweaks (standardizes) the results for easy interpretation.  A Beta of 1.0 indicates that the particular stock you are evaluating moves precisely with the benchmark—it goes up and down exactly as does the S&P 500.  A Beta less than 1.0 suggests that (at least in the past) the stock was less volatile than the S&P.  A Beta above 1.0 suggests that the stock is more volatile than the S&P.

Consider the idea that volatility is only a bad thing when it goes against the way you bet. If you are long in a stock, and it shoots past the S&P 500 average, then you picked an awesome stock! If it, however, plummets below the level of the S&P 500, then you are a much bigger loser than the overall market.  Beta assesses volatility objectively. What you ultimately decide to do with that information depends on how risk averse you are. Super conservative investors that are willing to tolerate very little risk will look for stocks with a Beta less than one, such as many utility stocks (often referred to as bond market equivalent stocks).

For example, as of this writing, the Beta for Procter & Gamble Co. (PG) is 0.6.  Risk takers seeking big rewards will often look for stocks with a high Beta and the accompanying possibility of big returns—and huge losses.  Note that Beta is neutral as to evaluating great returns or terrible returns.  As of this writing, the Beta for Goldman Sachs Group Inc. (GS) is 1.6.  Owners of GS are springing for the good stuff this Christmas!  Apple Inc. (AAPL), on the other hand, has a Beta of 1.3, and that volatility has been unwelcome to its investors.

To really get any useful information from Beta, there must be a correlation between the stock you are evaluating and the benchmark used in the computations.  To evaluate this, we can turn to another byproduct of regression analysis known lovingly by economists as R-squared. Think of R-squared as a percentage of covariation.  The closer to 100 you get, the more the stock traces the benchmark’s performance. The closer to zero you get, the less correlation there is between your stock and the benchmark.

More advanced measures have been developed since the advent of computer technology, such as the Sharpe Ratio. The bottom line is that Beta and other measures of volatility are useful tools (among many) that you can use to help you pick a stock that meets your investment needs and form realistic appraisals of how high it can go, and how low it can sink.


Demystifying Market Corrections

When the market starts trending down, many investors tend to panic.  They see volatility as dangerous and formulate a belief that the market is just not for them.  These panicked investors perceive the correction as something wrong with the market as a whole and lose sight of the fact that for every stock listed, there is a company behind it.  Veteran investors have come to expect these periodic “corrections” to what can be considered inflated prices. Corrections happen all the time. After “big runs,” you should anticipate them.  It is a terrible mistake to pull out of the market when they happen.

Jim Cramer teaches that a particularly profitable strategy for dealing with corrections is to avoid the trap of being 100% invested in the market at all times.  At times when the market is tanking, cash that makes nearly nothing can look like a great investment. As Cramer put it, “Nothing feels as good as cash when the market is coming down.”  This is actually a critical element of his axiom of “selling strength and buying weakness.” When the market is surging upwards, the strategy dictates that you “trim” here and there to generate cash to be in a position to buy during the next correction.  

If you don’t do this trimming and hold on, you may fall into the trap of selling your winners to subsidize your losers. This naïve practice can wreck a portfolio by filling it with junk because all of the blue chips have been sold off. When you realize that a stock is junk, then sell it and take the loss.  Use what’s left to reposition into something great. The real key to all of this is to differentiate between bad companies with deteriorating fundamentals and good companies with deteriorating stock prices.

Don’t forget that companies with good bottom lines can go bad because of larger forces that are outside of management’s control.  Geopolitics, exchange rates, Fed policy, and a host of other factors can cause a once great company to lose traction.  Don’t let your emotions get in the way of making rational decisions based on shifting fundamentals.  In a slowing economy, for example, you may see a consumer shift from premium brands to store brands that can hurt the bottom line of once great premium product companies.  Drug companies that have been making fortunes for years can suddenly see their bottom line drop out of sight when a family of important drugs goes off patent.  If you confuse the shifting fundamentals with a market correction and buy more while the stock is “on sale,” you can lose big.

If your portfolio is composed of great companies with great fundamentals, don’t fear the market corrections.  Those great stocks will bounce back, ready to ride the next upturn.

Demystifying Market Sectors

With all of the hoopla over various stock indices, it is sometimes easy to forget that the stock market is a market for individual stocks and not a singular entity that eats fortunes.  These stocks are not merely little pieces of paper (or the digital equivalent); they represent discrete pieces of ownership in living, breathing companies. These companies, taken collectively, do everything under the sun for which people will pay money.  Some stocks represent banks and other financial companies. Some stocks represent restaurants. Some represent clothing stores. Some represent mining, and some represent drilling for oil. When investment experts talk about sectors, they are talking about groups of stocks that have underlying businesses engaged in the same sort of income generating activities.

Because these companies do the same basic thing, they are subject to similar economic forces.   Sectors tend to rise together when economic conditions are good for those types of companies and fall together when economic conditions are bad.  For the investor, this means that poor sector performance can mean poor portfolio performance if you are not diversified across not only different stocks but different sectors of stocks.  Take the financial sector for example. If interest rates are on the rise and all of your investments are in bank stocks, then your whole portfolio will likely rise. If interest rates are cut, then your whole portfolio stands to plummet.  For the individual investor, the best advice is probably to buy the best of breed stocks across as many different sectors as you can.

In the United States, the most common system that you will see for sector classifications is the one used by Nasdaq.   Nasdaq uses the ICB (Industry Classification Benchmark), which is maintained by the FTSE Group. This system uses a hierarchical approach: ten “industries” at the topmost level, 19 “supersectors” below that, 41 “sectors” below that, and 114 “subsectors” at the fourth level. (You can download an Excel spreadsheet of this information from  http://www.icbenchmark.com/structure)  Be aware that these sector classifications may change depending on which information service you use.  When I look up the symbol TST on TD Ameritrade, it tells me that it is classified as “Financials: Capital Markets.”  When I look it up on Yahoo! Finance, I find that it is classified as being in the “Internet Information Providers” industry and in the “Technology” sector.
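The ICB's four-level hierarchy maps naturally onto nested lookups. The sketch below models a tiny slice of it; the level counts (10/19/41/114) come from the text, while the sample entries are a hand-picked, incomplete illustration rather than the actual benchmark data.

```python
# A tiny, illustrative slice of the ICB four-level hierarchy:
# industry -> supersector -> sector -> subsectors.
# These entries are examples only, not the full benchmark.
icb_sample = {
    "Financials": {
        "Banks": {
            "Banks": ["Banks"],
        },
        "Financial Services": {
            "Financial Services": ["Asset Managers", "Specialty Finance"],
        },
    },
    "Oil & Gas": {
        "Oil & Gas": {
            "Oil & Gas Producers": ["Exploration & Production",
                                    "Integrated Oil & Gas"],
            "Oil Equipment, Services & Distribution": ["Oil Equipment & Services",
                                                       "Pipelines"],
        },
    },
}

def subsectors_of(industry, supersector, sector):
    """Return the subsector names under a classification path."""
    return icb_sample[industry][supersector][sector]

print(subsectors_of("Oil & Gas", "Oil & Gas", "Oil & Gas Producers"))
```

The same walk-down-the-tree lookup works whichever classification service you use, which is why it pays to check how *your* data provider files a given ticker.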

There are several indices and ETFs (Exchange-Traded Funds) that are sector-based, allowing you to invest across a wide swath of stocks in a particular sector.  The sector ETFs are like broader index funds that only provide exposure to one sector rather than the entire market. Understanding a particular sector is important when picking stocks.  Different businesses measure success in different ways, and if you don’t know how to tell whether a particular breed of business is successful, you obviously shouldn’t speculate in that sector’s stocks.

A great way to find information about investing in a particular sector is to read the research done by high-quality research and investment firms, and of course by following the sector on TheStreet. Always remember that the fortunes of individual stocks are tied to sector valuations, largely because of these behemoth sector funds.  Perfectly solid companies with a stellar trajectory can take a huge hit if investors (especially the big institutional ones) dump the entire sector, just as they can when there is an overall market decline like the Great Recession. If you have done your homework, evaluated the fundamentals, and have conviction about the company’s story, then sector selloffs present an important buying opportunity.

Demystifying Price Targets

One of the most important predictors of short-term stock prices is the backing of market analysts in the form of buy, sell, and hold ratings.  Many analysts include a price target in their reports on particular stocks. These target prices are merely an analyst’s best guess as to the future price that a stock will reach.  Your online brokerage account will usually link to several such reports. It is important to realize that different analysts determine price targets in different ways. Some are purely quantitative and use regression analysis and other more advanced empirical techniques to predict future prices.  

An important consideration when weighing price targets is the timeframe the analyst is using.  Most analysts are not catering to the day trader; they are usually looking at least a year into the future.  Most analysts will use some form of value approach where the earnings and growth of the underlying company are the critical factors.  This means that sector fluctuations, overall market conditions, and market sentiment are not factored into the price model. A stock that eventually rises to the analyst’s target will usually see many ups and downs along the way.

Many investment experts agree that buy-and-hold alone is not a smart investment strategy.  Even the greatest companies wane over time. You have to keep an eye on the fundamentals of the stocks you own.  Price targets provide one method of helping you determine when a company may have reached its zenith and your portfolio would be better served by taking profit and reinvesting in a stock that still has “room to run.”  Note that analysts do update their price targets periodically, though not always as quickly as they perhaps should. When a stock reaches its current price target, it should be evaluated as a possible sell.

Value investors will seek to identify companies that are selling at a price that is too low given the fundamentals of the underlying company.  Analysts trying to establish price targets must make predictions based on past performance and future potential. Some of the most commonly cited factors that influence a stock’s valuation include its expected growth rate, dividend yield, and financial health.  The idea of “earnings visibility” often comes into play. Earnings visibility refers to the likelihood that projections about a company’s numbers are correct. Factors that make the analyst’s crystal ball cloudy, such as regulatory uncertainty, hamper earnings visibility.  Older companies that have weathered economic downturns are often regarded as safer bets based on this idea of visibility.

Many analysts also consider whether or not a company pays a dividend when evaluating the market value of a stock.  As you would expect, when all else is equal a company that pays a dividend should trade at a premium to a company that pays no dividend.  Dividends provide investors with tangible growth. Some companies have such a long history of paying stable dividends that they are considered to be the equity market equivalent to bonds.  Procter & Gamble is the quintessential example of such a stalwart company.

A company’s financial profile must be considered when valuing a stock. Earnings and earnings growth only tell part of the story.  The best-of-breed companies will have a high return on equity and a high return on invested capital. They will also have good “margins,” which are the proportion of revenue left over after all of the bills are paid.  The adage that “you have to spend money to make money” is true. Great companies have mastered the art of turning cash into even more cash. Some companies have to spend huge sums to generate cash, and the market punishes companies that have a track record of doing this.
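The three measures named above are simple ratios once you have the line items from a company's financial statements. The sketch below uses made-up figures, and the ROIC calculation is simplified (practitioners typically use after-tax operating profit rather than net income in the numerator).

```python
# Toy illustration of margin, ROE, and ROIC.
# All inputs are invented; real values come from financial statements.
# ROIC here is simplified (net income / invested capital).
def profitability(net_income, revenue, equity, invested_capital):
    return {
        "net_margin": net_income / revenue,     # income kept per dollar of sales
        "roe": net_income / equity,             # return on equity
        "roic": net_income / invested_capital,  # return on invested capital
    }

m = profitability(net_income=120, revenue=1000,
                  equity=600, invested_capital=800)
print(f"Net margin: {m['net_margin']:.0%}, "
      f"ROE: {m['roe']:.0%}, ROIC: {m['roic']:.0%}")
```

A company turning $1,000 of sales into $120 of income on $600 of equity is "turning cash into more cash" efficiently; a company needing far more capital for the same income would score poorly on these same ratios.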

This brief overview of how price targets are set demonstrates that setting price targets is as much art as science.  The simple truth is that even the best analysts get it wrong sometimes. This is an important reason that you must be diligent and do your homework.  You have to know the fundamentals of the companies that underlie your stock picks, and you must project those fundamentals into the future based on a comprehensive knowledge of what the company is doing.  Qualitative information about a company is often critically important, and cannot be boiled down into a single number that plugs into a formula.


Demystifying the Financials Sector

Stock market analysts often worry about market volatility, which is jargon for a rapid cycling of upward and downward trends in a particular stock, sector, or index. Some intrepid traders use this sort of volatility to capitalize on very quick downturns to buy and quick upturns to sell. Most long-term investors consider this strategy dangerous and risky, something akin to playing roulette.

Many day trading strategies exist to help these brave traders make these volatility-based plays. Long-term investors hate volatility because it makes the market less certain and picking stocks more difficult. As you may have predicted, the financial services sector is prone to high levels of volatility. The reason that everyone doesn’t avoid high-volatility stocks and sectors? Great risk often equates with great reward in the stock game.

If a company is in the business of handling other people’s money, then it likely fits into the financial industry. Banks, insurance companies, real estate companies, and financial service companies are the major supersectors in this industry. To get an understanding of exactly what makes stocks rise and fall in this industry, you must have a grip on each of the subsectors.

As a general rule, financials perform best in a low-interest-rate environment, but that statement must be qualified. The value of existing long-term debt such as mortgages is higher when interest rates fall. In periods when mortgages (and other long-term debt) were written at low rates, the general rule will not hold. The complex interaction between current interest rates and long-term interest rates is part of what makes this sector so potentially volatile.

When the business cycle is on the upswing, and there is a high level of confidence in the economy, both individuals and businesses seek to expand wealth. This is often accomplished through growth, which means that these individuals and corporations need financing. Businesses build and replace infrastructure, and individuals increase personal savings and investing.

These are heady days to be invested in the financial sector! Financials make up a large portion of the S&P 500, consisting of household names like Bank of America, Citigroup, and JP Morgan Chase. If you were invested in these financials at the beginning of the Trump Rally, you fared very well! (See a chart for November through December of 2016 for Goldman Sachs Group Inc. (GS) if you are a visual learner). Investors can’t afford to become complacent given these meteoric rises in equity. Never forget the devastating losses to this sector when the real estate bubble burst in 2008.
Many investors use the Financial Select Sector SPDR ETF (XLF) to track the overall health of this sector.

The volatility of that ETF has been quite low over the past three years (beta = 0.93, meaning it has moved slightly less than the broader market), debunking the notion that financials are a volatile sector that should be avoided by prudent investors. History teaches us that the sector’s volatility can, however, increase dramatically during uncertain economic times. In this sector, the prudent investor will shy away from a “buy and hold” strategy. The correct strategy is to follow the advice of Mr. Cramer: “Buy and Homework.” That homework must include drilling deep into the underlying business and inspecting the balance sheets before you pull the trigger. It also means tracking the larger economy, with a special emphasis on what the Fed is doing with interest rates.
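A beta like the 0.93 quoted above is computed as the covariance of the asset's returns with the market's returns, divided by the variance of the market's returns. The sketch below shows that arithmetic; the return series are made up for illustration, not actual XLF data.

```python
# Compute beta = cov(asset, market) / var(market).
# Sample (n-1) covariance/variance; return series are invented.
def beta(asset_returns, market_returns):
    n = len(asset_returns)
    mean_a = sum(asset_returns) / n
    mean_m = sum(market_returns) / n
    cov = sum((a - mean_a) * (m - mean_m)
              for a, m in zip(asset_returns, market_returns)) / (n - 1)
    var_m = sum((m - mean_m) ** 2 for m in market_returns) / (n - 1)
    return cov / var_m

market = [0.010, -0.020, 0.030, 0.005, -0.015]
fund   = [0.009, -0.018, 0.028, 0.004, -0.014]  # tracks the market, slightly damped
print(f"beta = {beta(fund, market):.2f}")
```

A beta below 1 means the fund has historically moved a bit less than the market on both the way up and the way down; a beta well above 1 is the signature of the volatile sectors the paragraph warns about.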


Demystifying the Oil and Gas Sector

As with most of the industries and sectors that stock investors may seek to invest in, the oil and gas sector is often intimidating because of the massive amount of jargon that is involved.  Another layer of complexity is added by the global nature of the oil market and the political nature of international relations in historically volatile regions. It is important to realize that oil and gas are commodities.  Supply and demand economics rules commodity prices. When there is a surplus of oil or gas, prices tend to go down. When demand is high, and supply is too low to meet it, then prices climb sharply. These fluctuations in commodity prices have an enormous impact on the bottom line of companies in this sector.  

Oil and gas are sometimes referred to as hydrocarbons because of their shared chemistry.  They are commonly referred to as “fossil fuels” because of the way they originated. The basic idea is that ancient plant and animal life were covered over by sediment, and this sediment later formed into rock.  Add a few million years, and presto: you get natural gas and crude oil. The first thing this tells us is that oil deposits are hard to find because they are in the ground, buried under hundreds or thousands of feet of rock.  In the case of offshore deposits, you can’t even get to the rocks without going through hundreds of feet of water. The sector that is most closely associated with finding the oil and gas in the first place is usually called exploration.   This exploration and production end of things is risky; if the geologists get it wrong and the hole is dry, then many millions of dollars have been wasted.  Many of these companies take the raw commodities out of the ground and turn them into useful products, such as gasoline, that people want to buy; this is most often referred to as refining.  Some companies get involved in the retail and distribution end of oil and gas as well, and these companies are usually classified in the “Integrated Oil and Gas” sector.

Another important sector in the oil industry is the “Oil Equipment, Services, and Distribution” sector.  Getting millions and millions of gallons of oil and natural gas refined and to retail markets is a massive undertaking.  Many companies provide tools, equipment, chemicals and so forth to the oil exploration and production companies. E&P companies often farm out the actual drilling of the wells to drilling companies.  Drilling companies earn profits based on contracts and are not tied directly to the price of oil as are the E&P companies. Most such companies get lumped together into the “Oil Equipment and Services” subsector, but “Pipelines” are such a big deal that they get a subsector designation.

Most of the companies that explore for oil and gas also drill down to find it and bring it to the surface, a process called production.  Exploration and Production (E&P) company stocks sell at a premium when oil prices are high, and tend to sell at a discount when oil prices are low.  The balance sheets of these companies are composed of line items directly related to drilling for oil and gas and getting it out of the ground once it is found.  This means that investors in this sector must be familiar with the terminology and jargon of E&P as part of their homework on investing in the “oil patch.” As with any commodity, profits are made on volume of sales.  Wheat and corn sell by the bushel, and oil sells by the barrel (42 U.S. gallons). Natural gas, on the other hand, is sold by the cubic foot (at a standard temperature and pressure).

Another important set of jargon you need to understand before investing in the oil patch is the difference between upstream, midstream, and downstream.  The term “upstream” is used to refer to the source of the oil or gas; the E&P side of things. The midstream is focused on storage and transportation. Finally, the downstream side refers to the refining and distribution of refined products.  For example, a drilling rig in Alaska would represent an “upstream” activity. The transportation of the oil from that well via a pipeline would be midstream activity. The refining and sale of gasoline would be downstream activity. These distinctions are important because they provide different potential risks and rewards for the investor.       

Just as with any other company, the value of an E&P company stock is directly related to its predicted future earning capacity.  These companies have a finite amount of oil or gas that they can pull out of the ground given all of the wells they currently have producing.  These still-in-the-ground reservoirs are key to the valuation of E&P companies. Oil companies must always be exploring for new reserves or face bankruptcy.  Note that reserves in the oil patch are different from a company’s expected earnings.

Curiously, oil patch investors pay close attention to the “netback,” or profit per barrel of a particular production operation.  That is, what it costs to get a barrel of oil to the retail market is subtracted from what the products sell for. Companies with a higher netback tend to sell at a premium, while companies with a low netback tend to sell at a discount.  Netback rises when costs can be cut at any point from initial exploration to the final sale at the gas pump. These factors have historically been very predictable, with the American oil industry suffering when the sale price of a barrel of oil was low.  If it takes an American company $75 to get a barrel of oil to market and the price of oil is at $50, then obviously these companies cannot be profitable on the domestic side of the business.
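The netback arithmetic is simple subtraction: what a barrel sells for, minus everything it costs to get that barrel to market. The sketch below reproduces the $75-cost, $50-price scenario from the text; the breakdown of the $75 into individual cost line items is an invented illustration.

```python
# Netback per barrel = sale price - total cost to bring it to market.
# The cost breakdown below is illustrative; only the $75 total and
# $50 price come from the text.
def netback_per_barrel(sale_price, production_cost, transport_cost, royalties):
    return sale_price - (production_cost + transport_cost + royalties)

nb = netback_per_barrel(sale_price=50.0,
                        production_cost=55.0,
                        transport_cost=12.0,
                        royalties=8.0)  # total cost = $75/bbl
print(f"Netback: ${nb:.2f} per barrel")  # negative -> unprofitable at $50 oil
```

A negative netback means every barrel sold loses money, which is exactly why high-cost domestic producers suffer when crude prices fall below their all-in cost.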

Technology has already played a major role in improving the viability of American E&P companies.  We have become better at finding oil and gas, we’ve gotten better at getting it out of the ground, and we’ve become more economical at getting it to market.  When American E&P CEOs are telling investors that they can make a comfortable profit in the $45 per barrel range and oil is selling at $50, then it is a potentially exciting time for investors.

Note that the “Oil and Gas” industry would probably have been better named the “Energy” sector, and that’s what a lot of investors call it.  One reason for this is the fact that the “Alternative Energy” sector is within the “Oil and Gas” sector, creating an oxymoron. When we talk about “Alternative Energy” we are talking about alternatives to oil and gas.  The two major subsectors in this sector are “Renewable Energy Equipment” and “Alternative Fuels.”

Most companies tied to solar and wind will be tied to the equipment subsector, and oil and gas alternatives such as ethanol and fuel cells will be tied to the fuels subsector.  Most of the companies in this sector are very speculative and not suitable for the long-term value investor. So long as crude is selling for $50 or less, then the alternative sector has a long way to go before it can become broadly competitive. During periods of “environmentally friendly” politics and policy, government incentives make this sector seem more attractive.  During periods of pro-business policy and deregulation, oil and gas will be king, at least for the foreseeable future.


Demystifying “You Can’t Beat the Market”

As you know, economists are scientific types who study how money flows and grows. They study many different aspects of this very broad concept, including how stock market prices behave. Being scientific types, they love to develop theories that explain many different phenomena. One particularly important theory that has been the foundation of America’s retirement strategy since the invention of the 401(k) is the Efficient Market Hypothesis (EMH).

In its simplest form, the theory holds that the price that stocks sell for factors in all available information, and thus no trader can have any real advantage over the other. The idea of efficiency comes into play because the market has already priced each stock given all of the relevant information. If stocks are priced to reflect this efficiency, then it is impossible to pick undervalued stocks to buy and overvalued stocks to sell because those things cannot exist in an efficient market.

This translates into the following investment advice: Stick all of your money into a low-cost mutual fund that reflects the entire market and hope that the overall economy grows. This is the strategy that many 401(k) “experts” dole out to folks hoping to have enough money to retire one day. Put some in an index fund, put some in bonds, and put some in annuities, then wait for retirement. Any scientist (I classify economics as a social science) will tell you that it is much easier to disprove a theory than it is to prove one.

To disprove a theory, all you need are some examples of it not working. The EMH doesn’t hold up in the face of short-term performance legends like Jim Cramer. It doesn’t hold up well in the long run when we consider the mind-boggling success of the long-term value plays of Warren Buffett. Unless you believe that Mr. Cramer and Mr. Buffett are magical creatures, then you must reject the EMH as an absolute law of economics.

In fairness to the supporters of the EMH, it must be acknowledged that any randomly selected portfolio has a very small chance of beating the market by a substantial margin. The laws of probability are clear on that one. Any fund managers that beat the market for a quarter or even a year may well have random chance to thank for their success. In everyday language, this probabilistic growth would be a case of “getting lucky.” Another law of probability tells us that improbable events occurring in a long series become very, very improbable.

Take poker as an example. If we are playing and you get a full house, I’ll say that you got lucky. If you get a full house twenty times in a row, I’ll say you are cheating because that happening by chance is just too improbable—nobody is that lucky. When I look at the careers of great investors like Jim Cramer, Warren Buffett, and Peter Lynch, I must reject the efficient market hypothesis. There are other lines of attack on the theory, such as the impossibility of a “market correction” if the hypothesis is true, but I hope I’ve made my point.
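The "lucky streak" argument can be put in numbers. If beating the market in any given year were a pure coin flip (the 50/50 probability is an assumption for illustration), the odds of a long winning streak shrink geometrically, just like the repeated full houses:

```python
# If beating the market each year were a coin flip (p = 0.5, an
# illustrative assumption), independent years multiply:
# P(streak of n) = p ** n.
def streak_probability(p_single, years):
    return p_single ** years

for years in (1, 5, 10, 20):
    print(f"{years:>2} years in a row: {streak_probability(0.5, years):.6%}")
```

Twenty consecutive market-beating years by pure chance works out to roughly one in a million, which is why decades-long track records like Buffett's are hard to square with a strictly efficient market.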

My ultimate conclusion is that you can indeed perform better than the overall market. I would be remiss if I didn’t point out that doing so is not easy. While I’ve argued against the EMH, I will say that it is mostly accurate most of the time. Most stocks do trade right around where they should. Finding an undervalued stock is wonderfully hard work. Identifying a catalyst that will send it upward is more work still. Even so, with a little luck and a lot of homework, you can beat the market.