# Characteristics of Some High Frequency Trading Algorithms

Gilbert "Gman" Mendez

It seems the chatter about High Frequency Algorithms has died down a fair amount. Maybe as intraday traders we have become desensitized to their shenanigans and have learned to live with the frustration they bring to our business. Or maybe there really isn’t much to talk about. How many times do we need to hear the spiel about how they make the market more efficient? Yada, yada, yada. Instead of writing – complaining, even – about the HFTs, I want to talk about some of their characteristics.

I have been a bit out of the loop in the past few days for health reasons. I am currently battling a silly case of the flu, and let me tell you, there are only so many 16-hour days of sleeping I can handle. So as I lie here awake in the middle of the night staring at Lady Liberty, I can’t help but think about how these dopey programs work. I can’t imagine them being that complicated. Speed is the name of the game, so overly complicated mathematical computations are out of the question.

Thinking back on my engineering days and the cheerful lectures (sarcasm?) on computational nonsense, I recall a few things. Linear systems are easy to program and computationally cheap. From a mathematical standpoint this boils down to the average price and the rate of change in the price of a stock (think fear/greed thermometers, if you will). Further, common sense tells us these algorithms depend on volume and liquidity to run their show. So an accumulation/noise algorithm tries to keep the rate of change in the stock low while volume is light, giving a magical moving average a chance to catch up to the price, provided there is enough liquidity in the stock.
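To make the "mathematical gibberish" concrete, here is a minimal sketch of the two linear quantities mentioned above: a trailing moving average and the rate of change of price, plus a crude check for the quiet, low-volume state being described. Every name, window size, and threshold here is an illustrative assumption, not real HFT code.

```python
# Sketch of the two "linear" quantities in the text: a moving average of
# price and the rate of change of price. All thresholds are invented.

def moving_average(prices, window):
    """Simple trailing moving average; the 'magical moving average'."""
    return [
        sum(prices[max(0, i + 1 - window): i + 1]) / min(i + 1, window)
        for i in range(len(prices))
    ]

def rate_of_change(prices):
    """First difference of price; a crude fear/greed thermometer."""
    return [0.0] + [b - a for a, b in zip(prices, prices[1:])]

def looks_like_accumulation(prices, volumes, window=5,
                            roc_limit=0.05, volume_limit=1_000):
    """True when price is barely moving on light volume while the
    moving average has nearly caught up to the last price."""
    roc = rate_of_change(prices)
    ma = moving_average(prices, window)
    quiet = all(abs(r) <= roc_limit for r in roc[-window:])
    light = all(v <= volume_limit for v in volumes[-window:])
    converging = abs(ma[-1] - prices[-1]) < roc_limit
    return quiet and light and converging

# A flat, light-volume tape fits the pattern; a fast trend does not.
flat = looks_like_accumulation(
    [10.00, 10.01, 9.99, 10.00, 10.01, 10.00], [500] * 6)
trending = looks_like_accumulation(
    [10.0, 10.5, 11.0, 11.5, 12.0, 12.5], [500] * 6)
```

Nothing fancy: additions and subtractions only, which is the point of the paragraph above about speed ruling out heavy math.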

I know I have now lost some of you with all this mathematical gibberish. The point I’m trying to make is simple: the more controlled a stock is (low rate of change), the more likely it is that HFT algorithms will run us over. Think of consolidations: these were levels where we as traders would take considerable positions in anticipation of bigger moves. We would over-leverage our positions, knowing our risk was well defined. When the rate of change in price picked up in the direction opposite the one we expected, we would exit, assuming we were wrong.

But what if the program simply drops the level right before the magical average is about to catch up to it? Then, as traders start to puke positions, the algorithm is able to pick up significant volume at a slight discount. Those who just exited their positions realize what happened and scramble to get back in, altogether altering the rate of change in the stock and leading to the actual move. Just “simple and elegant,” as my calculus professor would say.
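The sequence just described (drop the level, absorb the stop-outs at a discount, then let the scramble to re-enter drive the real move) can be caricatured in a few lines. Every price, size, and the "algorithm" itself are made up purely for illustration:

```python
# Caricature of the stop-run described above. All numbers are invented.

def run_stops(support, trader_stops, discount=0.02):
    """Drop the quote just below support, buy whatever the stopped-out
    traders puke, and return (fill price, volume accumulated)."""
    trigger_price = support - discount  # the level drops first
    filled = [size for stop, size in trader_stops if stop >= trigger_price]
    return trigger_price, sum(filled)

# Traders holding a consolidation at 20.00 with stops just beneath it:
stops = [(19.99, 300), (19.98, 500), (19.90, 200)]
price, volume = run_stops(20.00, stops)
# The algorithm accumulates 800 shares a couple of cents under the level,
# while the trader whose stop sat lower (19.90) keeps his position.
```

The trap is that the tighter and more "well defined" the risk at the level, the more stops cluster just beneath it, and the cheaper it is to shake them out.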

How this information is useful to trading is the important part of the puzzle. I choose to trade a bit aggressively during times of high volatility (steep rate of change) and when there is either too little or too much liquidity for programs to run their show. That only seems to be the case in premarket, after hours, and the first 15-20 minutes after the open. At all other times I am very careful not to get into plays that “seem too obvious” or where the rate of change seems to be about flat. I would rather wait for the volatility and volume to manifest themselves before coming out to play.

That doesn’t mean I am suggesting you should start chasing moves in a stock as a profitable strategy. I am suggesting that those struggling should consider developing momentum trading skills. I am also suggesting being cautious around points of low volatility. These are the levels where stocks can often be “manipulated” or, as the SEC would put it, made more efficient. For now, I am off to pop some NyQuil to make my immune system more efficient, hoping I can sleep this off and make it to the open tomorrow. Happy Holidays!