Five Minutes With Jitesh Thakkar


Five Minutes with Jitesh Thakkar from Edge Financial Technologies


Beyond algorithmic trading, there are technologies such as cloud computing and hardware acceleration that can optimize your trading operations. These advancements are revolutionizing the financial industry and leaving market participants scrambling to understand the 'next big thing.' MarketsWiki’s Jessica Titlebaum sat down with Jitesh Thakkar, president of Edge Financial Technologies, to discuss the newest concepts and most recent developments, and to dissect how all of this technology benefits the end-user.

Q. How has the role of technology changed within trading?

A. Technology has taken a much more prominent role in trading. Ten years ago technology was around, but now it is the central piece of a strategy, and for many strategies it is the difference between a profit and a loss. I see heavy investment in technology by all the major hedge funds and proprietary trading firms. Strategy is of course important, but next to strategy, the right technology makes the difference in where you stand among your competitors.

Q. What are some of the technologies in which you see people investing?

A. I see an increased trend toward direct exchange connectivity and toward low latency. This runs across the various parts of the trading architecture, which include software, hardware, networking and colocation. Those are the four major components.

Q. What are some of the risks and benefits of algorithmic trading?

A. Algorithms are run by computers and make the same decision every time, which eliminates the human factor. Your trades are optimized, they are more efficient and your profits can become more consistent.

Another point is that you have the ability to back-test your strategy over historical data to figure out whether it is profitable. You can determine how much money the strategy is making and how it reacts to the market under different types of conditions. It really gives you a chance to tweak your strategy without costing you any money in the markets.
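To make the back-testing idea concrete, here is a minimal sketch of replaying a simple strategy over historical prices. The moving-average crossover rule, the made-up price series and the parameter values are illustrative assumptions, not anything discussed in the interview.

```python
# Minimal back-test sketch: replay a moving-average crossover rule over
# historical closing prices and collect the profit or loss of each trade.
# The rule and the data are made up; a real back-test would use real
# market data and account for fees, slippage and position sizing.
import math

def moving_average(prices, window):
    return sum(prices[-window:]) / window

def backtest(closes, fast=5, slow=20):
    """Go long when the fast average crosses above the slow one,
    go flat when it drops back below. Returns per-trade P&L."""
    position, entry_price, trade_pnls = 0, 0.0, []
    for i in range(slow, len(closes)):
        fast_ma = moving_average(closes[:i], fast)
        slow_ma = moving_average(closes[:i], slow)
        price = closes[i]
        if position == 0 and fast_ma > slow_ma:
            position, entry_price = 1, price            # enter long
        elif position == 1 and fast_ma < slow_ma:
            trade_pnls.append(price - entry_price)      # exit, book the trade
            position = 0
    return trade_pnls

# Run it over a made-up, cyclical price series.
closes = [100 + 10 * math.sin(i / 8) for i in range(200)]
pnls = backtest(closes)
print(f"trades: {len(pnls)}, winners: {sum(1 for p in pnls if p > 0)}, "
      f"total P&L: {sum(pnls):.2f}")
```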

There are risks associated with algorithmic trading. Like computers, algorithms are only as good as the people who design them. Markets can be volatile and algorithms will only do what they are supposed to do, so you can lose money if a situation occurs that the creator of the algorithm did not anticipate.

Q. How can traders use other technologies to increase profits?

A. It really builds on the previous question about algorithmic trading. One way is to use back-testing as a tool, along with software that helps you optimize your algorithms to increase profits. You can use tools and software packages to turn an idea into an algorithm and then back-test it to make sure it’s profitable. You are also able to tweak and fine-tune the strategy to increase profits and reduce your losing trades.

Q. What specific software packages are you seeing people utilize?

A. I see people using TradeStation or CQG to quickly create their algorithm and back-test it over six to 12 months of historical data. These packages provide the historical data, so you can set up one version of an algorithm to run live and another to run against the historical data. They give you automatic charts and tables of how many trades you made versus how many were profitable. You can dig deeper and find out what percentage of trades was profitable, and there are often optimizers built in so you can fine-tune your algorithms to make them more profitable.
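The reporting and optimization he describes can be pictured with a small, hand-rolled sketch; it is not how TradeStation or CQG are actually implemented, and the trade results and parameter settings are made up for illustration.

```python
# Sketch of a back-test performance report plus a brute-force "optimizer":
# summarize how many trades were profitable, then sweep parameter settings
# and keep whichever one back-tests best. All numbers are made up.

def summarize(trade_pnls):
    wins = [p for p in trade_pnls if p > 0]
    return {
        "trades": len(trade_pnls),
        "winners": len(wins),
        "percent_profitable": 100.0 * len(wins) / len(trade_pnls) if trade_pnls else 0.0,
        "total_pnl": sum(trade_pnls),
    }

def best_parameter(results_by_setting):
    """Pick the setting whose back-test produced the highest total P&L."""
    return max(results_by_setting, key=lambda s: sum(results_by_setting[s]))

# Hypothetical per-trade results for three stop-loss settings (in ticks).
results = {10: [5, -2, 3], 20: [8, -1, -1, 6], 30: [2, -4]}
print(summarize(results[20]))
print("best stop-loss setting:", best_parameter(results))
```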

Another thing I have seen traders do with technology is use macros or software to improve what they do manually. Algorithmic trading is great and used by a huge group of people, but there is another group that came off the floor and just started screen trading. They still want control of the trade and use a screen, or even a mouse, to send trades manually. This group uses trading technology to make that process better and more efficient. They will do two things: use keyboard shortcuts or use macros. Macros are basically a bunch of things that you would otherwise do by hand; say there is step 1, step 2 and step 3. You combine all of those steps into one command, and by pressing that one command you execute the whole group at once.
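As a sketch of the macro idea in the simplest possible terms: several manual steps are grouped under one command, so one keystroke runs the whole sequence. The step names and the shortcut are hypothetical; a real trading front end would wire this into its own order-entry functions.

```python
# Macro sketch: a named command mapped to an ordered list of steps that
# would otherwise be performed by hand. The steps here only print, but in
# a trading application they would call the platform's order functions.

def cancel_working_orders():
    print("step 1: cancel all working orders")

def flatten_position():
    print("step 2: send an order to flatten the position")

def write_trade_log():
    print("step 3: record the action in the trade log")

MACROS = {
    "F12_flatten_everything": [cancel_working_orders, flatten_position, write_trade_log],
}

def run_macro(name):
    """One keystroke (command) executes the whole group of steps."""
    for step in MACROS[name]:
        step()

run_macro("F12_flatten_everything")
```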

Q. What types of projects do you see a lot of?

A. Our projects mostly revolve around direct market access and algorithmic trading. We see a lot of projects that involve taking manual trading strategies and automating them. Another big aspect is taking existing strategies and optimizing them for lower latency. We also look at the logic of the algorithms themselves to increase profits.

We have been getting a lot of requests for risk management technology, especially after the “flash crash.” Many people are interested in post-trade risk management and are increasingly asking for solutions that let them view their open orders in real time across multiple exchanges and multiple trading platforms. I also see an increased focus on pre-trade risk controls.
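As an illustration of what a pre-trade risk control does, here is a minimal sketch: every outgoing order is checked against simple limits before it goes to the exchange. The limits, product code and fields are assumptions made up for the example.

```python
# Pre-trade risk check sketch: block an order if it is too large or would
# push the net position past a limit. Values are illustrative only.

MAX_ORDER_QTY = 100          # largest single order allowed
MAX_NET_POSITION = 500       # largest net position allowed per product

positions = {"ESZ1": 120}    # hypothetical current net position per product

def pre_trade_check(product, side, qty):
    """Return True if the order may be sent, False if it must be blocked."""
    if qty > MAX_ORDER_QTY:
        return False                                  # size / fat-finger limit
    signed_qty = qty if side == "BUY" else -qty
    projected = positions.get(product, 0) + signed_qty
    if abs(projected) > MAX_NET_POSITION:
        return False                                  # position limit breached
    return True

print(pre_trade_check("ESZ1", "BUY", 50))    # True: within both limits
print(pre_trade_check("ESZ1", "BUY", 450))   # False: single order too large
print(pre_trade_check("ESZ1", "SELL", 90))   # True: projected position is 30
```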

Q. What are some of the challenges you see with technology?

A. The cost is going to continue to rise because there is so much out there and there is a need for good talent, good developers and good network engineers. Technologists in general are hard to find.

For a small firm to get started, the cost has risen significantly because there are so many aspects of trading technology. As mentioned before, there are different components to consider: colocation, networking, and the hardware and software itself. Each of these needs to be managed and maintained.

With regard to low latency, this is an arms race that has been going on for a while. It is becoming increasingly difficult to lower latency below a certain point, and the cost of shaving off even a single millisecond has increased over the last year.

Q. Why is it important to shave off one or two milliseconds?

A. It is important for traders who trade in both Chicago and New York. You need to get all of this data from NYSE Liffe or Nasdaq to Chicago because a lot of traders trade products from the New York exchanges against the CME or ICE. If you are trading at both, either your servers are in Chicago, in which case you are fast in Chicago but it takes time to reach New York, or it is the other way around.

Even if a firm puts servers in both New York and Chicago, those servers need to communicate, and sending a message one way still takes 15 milliseconds. It’s a race because many people are trying to make the same trade. If you can get your data from New York to Chicago one millisecond faster than the other guy, you can act on that information faster.
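For a rough sense of why there is a floor under that number, the propagation delay alone can be estimated with back-of-the-envelope arithmetic; the route length here is an assumed round figure, not a number from the interview.

```python
# Back-of-the-envelope floor on Chicago <-> New York latency.
# Assumed values: ~1,300 km of fiber, light covering ~200 km per
# millisecond in glass. Real links add switching and routing on top.

route_km = 1300
km_per_ms_in_fiber = 200

one_way_ms = route_km / km_per_ms_in_fiber
print(f"propagation alone: about {one_way_ms:.1f} ms one way")
# Everything above that physical floor is equipment, routing and path
# length, which is why shaving another millisecond gets expensive.
```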

Q. What is the next big thing in technology?

A. Hardware acceleration is good for a small group of people. There are competing technologies in that space, such as FPGAs, where you move software onto hardware, onto a chip, because it runs faster. For example, a piece of software may take one millisecond to process on the CPU; on an FPGA it will take a tenth of that time. So again, it gives you that competitive edge. You can react to market data faster and send your orders faster.

There is also something called a graphics processing unit, or GPU. NVIDIA has specialized cards with a few hundred GPU cores on them. The idea is that instead of doing all your processing on one big central processing unit (CPU), you distribute the processing across those few hundred cores and do it in parallel.
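The serial-versus-parallel idea can be sketched with an array library: the same calculation done one value at a time, and done as a single array-wide operation of the kind a many-core device spreads across its cores. NumPy stands in here purely for illustration; it is not a GPU library itself.

```python
# Same arithmetic two ways: element by element (serial) and as one
# array-at-a-time operation (data-parallel). GPU array libraries use the
# same whole-array style so the work can be split across many cores.
import numpy as np

prices = np.linspace(90.0, 110.0, 100_000)   # made-up inputs

serial = [p * 1.01 for p in prices]          # one value at a time
parallel = prices * 1.01                     # one operation over the whole array

print(np.allclose(serial, parallel))         # True: same results, different execution model
```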

Q. How would that benefit the end-user?

A. There are certain types of calculations that you can do in parallel; options pricing, for example. Options traders do a lot of mathematical calculations based on matrices, which are sets of numbers. A lot of this needs to be done in parallel, especially for options, because when one stock price changes it affects a whole bunch of options that depend on it, and you have to recalculate the Greeks at the same time.
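As a simplified picture of that recalculation, the Greeks of an entire strike chain can be recomputed in one array operation each time the underlying ticks. Black-Scholes call delta and gamma are used here only as a stand-in, and the strikes, volatility, rate and expiry are made-up inputs.

```python
# Recompute delta and gamma for a whole option chain when the underlying
# moves, as one vectorized calculation (Black-Scholes, illustrative inputs).
import numpy as np
from scipy.stats import norm

strikes = np.arange(80.0, 121.0, 1.0)     # one call option per strike
vol, rate, t = 0.25, 0.02, 30 / 365       # volatility, rate, years to expiry

def chain_greeks(spot):
    d1 = (np.log(spot / strikes) + (rate + 0.5 * vol**2) * t) / (vol * np.sqrt(t))
    delta = norm.cdf(d1)                              # call delta per strike
    gamma = norm.pdf(d1) / (spot * vol * np.sqrt(t))  # gamma per strike
    return delta, gamma

# The underlying ticks from 100.00 to 100.25: every option's Greeks are
# refreshed in the same pass, which is the kind of work that maps well
# onto hundreds of parallel cores.
delta_before, _ = chain_greeks(100.00)
delta_after, _ = chain_greeks(100.25)
print(np.round(delta_after - delta_before, 4)[:5])
```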

Regarding emerging technologies, another ‘next big thing’ is cloud computing. I see the use of cloud computing increasing, and one big use is storing historical data for back-testing as well as charting.

Q. What is Cloud Computing?

A. The concept involves using software, hardware, servers and all of that capacity without paying the full price for it up front.

For example, when you use Microsoft Word, you usually buy a copy of Microsoft Word, you own it, install it on your computer and it’s yours. You pay a base price upfront.

If you were using the cloud computing concept, you would pay for Microsoft Word on a per-use basis, the way you pay your phone or electricity bill. So let’s say you write 20 documents a month, 50 pages per month. Microsoft would charge you per page, whatever their rate is. Instead of paying the $300 up front, you pay much less, but you pay per usage.

Utilizing cloud computing will lower the cost of entry for end-users. That example is for software but the same thing goes for hardware and servers.
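The trade-off he sketches is just a break-even calculation; the per-page rate below is an assumption, while the $300 price and 50 pages a month come from his example.

```python
# Break-even sketch for pay-per-use versus buying up front.
upfront_license = 300.00     # one-time purchase price from the example
price_per_page = 0.05        # assumed per-use rate (hypothetical)
pages_per_month = 50         # usage from the example

monthly_cost = price_per_page * pages_per_month
months_to_break_even = upfront_license / monthly_cost
print(f"pay-per-use: ${monthly_cost:.2f}/month; cheaper than buying "
      f"for about {months_to_break_even:.0f} months of this usage")
```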

The providers of the hardware or servers have to set up the infrastructure. They are the ones that make the investment on behalf of their clients. They offer it as a utility.

Q. Is the cloud in ‘Cloud Computing’ really a cloud or is it a distributed cloud-like internal system?

A. The ‘cloud’ comes from network architecture diagrams. If you look at how the Internet is architected, it is a set of distributed computers connected to each other. If you take a closer look, there is something called the backbone, where huge computers that are central to the Internet route traffic to one another. Whenever people draw these diagrams, they draw the connected computers and a cloud that encompasses everything.

Q. How would you use cloud computing for charting?

A. It’s for getting historical charts; you get the information from the cloud. It is much more cost-effective than having a lot of local storage. If you want six months’ worth of charts and six months’ worth of data, it is much more economical to use cloud computing for that.

Q. Are these developments fads or here to stay?

A. Ultra-low latency is definitely here to stay. It is going to continue to the point where it pushes up against barriers. These developments are new now, but over the next few years ultra-low latency and hardware acceleration will become commodities.