A vital enabler of these firms is leverage through margin financing. There wouldn't be quant trading firms without leverage, extended by a prime broker. Larger firms deploy fairly interesting, complex arrangements across multiple prime brokers. You don't want a PB to know your entire trading strategy. The PB, on the other hand, wants self-funding accounts, featuring hard-to-borrows and other types of valuable securities as collateral that can be rehypothecated and used by its securities lending businesses. Finding a viable leverage policy across multiple prime brokers is a nice, juicy challenge for the problem solvers.
> There wouldn't be quant trading firms without leverage, extended by a prime broker.
Please allow me to disagree :) Most larger pure-algo HFT prop shops are self-clearing, they are exchange members (possibly with more favorable fees due to market making) and don't use pbs, instead relying on their own co-located execution infrastructure and funding.
This is true for, as you said, large HFT shops. But there are thousands of quant trading firms that are not on this scale and will not be an exchange member. You can see all the NYSE exchange members here [0], and it's a very small portion of all quant funds.
Managing margin and leverage between prime brokers is an important job for the backoffice at the fund I work for. These numbers are watched very closely, and we have strong, multiyear relationships with the people who work at these PBs.
It really depends on the trade, and what margin offsets you can use. Simplest case is everything in a single exchange, with full margin offsets. If it is split in a way that several exchanges or regulators require capital to be posted, then you need to sort out financing somehow.
A lot of that trading activity is still leveraged even if self-clearing. Futures/options trades are based on margin for instance, e.g. $4800 ES margin vs. $140,000 (50x2800) notional means 29:1 leverage.
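For concreteness, the arithmetic behind that leverage figure is just notional divided by posted margin, using the multiplier and margin quoted above (these numbers change over time):

    # Rough leverage calculation for one ES (E-mini S&P 500) futures contract,
    # using the figures quoted above; margin requirements change over time.
    contract_multiplier = 50      # dollars per index point
    futures_price = 2800          # quoted index level
    initial_margin = 4800         # dollars posted per contract

    notional = contract_multiplier * futures_price   # 50 * 2800 = 140,000
    leverage = notional / initial_margin              # ~29x

    print(f"Notional: ${notional:,}  Leverage: {leverage:.1f}:1")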
And how could you trust them? Trades are fairly anonymous at exchanges. All you could see is that your strategy doesn't work as well anymore, but you couldn't tell why exactly, or how they got hold of your strategy.
Like copying a good book, copying a successful strategy is labor intensive, takes a lot of time to get right, and is rather large and involved. (This is true of copying a strategy; copying a particular set of trades is trivial if you know the trades).
Being a good prime broker is also a lot of work, and is an elaborate and involved process.
Now for the firm to take on two jobs, and then keep it secret... I wouldn't worry about it. It's like worrying your accountant is going to take your successful startup idea and run with it because they know how much money you make.
On Canadian exchanges, both counterparties are part of the market trade data, by law.
And in all markets, that's one reason why many firms invest in transaction cost and best execution analysis, to make sure they're not getting front-run when they see abnormal dips in broker quality.
This is substantially higher information density than anything I've read in the last week. Learning about triangular arbitrage programmatic trading alone looks like it's going to occupy most of the next few days for me.
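If it helps to make triangular arbitrage concrete, here is a minimal sketch of the core check: whether the product of three exchange rates around a cycle, net of fees, exceeds 1. The rates and fee below are made up for illustration, not real market data:

    # Triangular arbitrage check: convert A -> B -> C -> A and see whether the
    # product of the three rates, after paying a fee on each leg, exceeds 1.
    # All numbers here are hypothetical.

    def triangular_edge(rate_ab, rate_bc, rate_ca, fee=0.001):
        """Fractional gain (or loss) from sending one unit of A around the cycle."""
        gross = rate_ab * rate_bc * rate_ca
        net = gross * (1 - fee) ** 3      # a taker fee on each of the three legs
        return net - 1.0

    # Example: USD -> EUR -> GBP -> USD with slightly inconsistent quotes.
    edge = triangular_edge(rate_ab=0.92, rate_bc=0.86, rate_ca=1.27)
    print(f"Edge per unit traded: {edge:+.4%}")   # positive => apparent arbitrage

In practice the hard part, as the replies below note, is execution: latency, fees, and the risk that one leg doesn't fill.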
Btw, people are doing lots of arbitrage with crypto currencies as well if you weren’t already aware. It could be a fun way to play with it without spending much if you’re interested.
The problem with cryptocurrency arbitrage is that the profit depends on how quickly and reliably the various exchanges process your transactions and how high the transaction fees are, not on how sophisticated your algorithm is.
Maybe nit-picking ... but shouldn't a very sophisticated algorithm precisely incorporate the reliability/speed of the various exchanges (maybe based on past and current latency or downtime, weighing the risk of a transaction not clearing at a given exchange)? The smartest algorithm should be able to win such a battle on average.
Until/unless the crypto market in general crashes, or the gains have all been arbitraged out, at which point we'll see a flood of Medium posts about how someone bought their house by arbing DOGE.
Can't speak for parent, but when I find an interesting subject I'll read up on it. I might even buy books in the field.
While my primary motivation is curiosity, you never know when knowing a specific thing might come in handy. Or you might be able to apply some principles in a different field.
Of course time is limited, so when deciding if I'm going to invest the time I do a quick check on ROI. How much time do I think I'll have to invest? This depends on how much I know in the general area of the subject and how specialized I assume the subject to be - so basically, how much time do I have to spend to learn the required vocabulary. I weigh that against how useful I assume the knowledge to be. Can I directly apply the new knowledge, or do I assume it to have synergies with my main subject of interest?
But sometimes a subject is so interesting to me that learning about it is its own reward.
Three days of reading on a specialist applied subject (like quantitative finance) which is super-competitive (because huge money, and literally competing against other market participants) and very high barrier to entry makes it seem like the reward is very low. If that same time were spent in general machine learning techniques, for example, it would have an outsize return as it could then be applied to any field. Indeed it is possible that minimal sophistication might yield excellent returns on the market if one is a machine learning expert, trains a model on historical data, and then applies its predictions using a pretty dumb web interface and no sophisticated stack at all. This is why I'm surprised grandparent is investing the time.
This is a very sensible mental model. The time to learn something increases in cost as we move through life stages, and this ROI calculator is deeply useful for me. However, I really do think I need to bias for action, as it can easily become a procrastinator's excuse.
Sorry, what does "this" refer to when you write "This is a very sensible mental model"?
I'm trying to read your whole comment, it's interesting especially since most hackers are somewhat self-taught and there are many fields we can become proficient in (such as dozens of resume keywords) if we want to invest the time.
So I'm interested in how people make these choices. Could you elaborate?
This was the downvoted parent saying "I can't go learn everything, so I look at the likely ROI. Learning hard, complex up-and-down-stack optimisation that benefits a few hundred companies globally - and they are already massively oversubscribed with brilliant 25-year-old mathematicians - is not a good ROI".
But spending a month of hourly commutes learning the basics of NLP or Bootstrap or PGP or ... is likely to have a much higher ROI.
My mental model is:
- There are a few fields that are wide open right now: ML (vision 90%, text 50%), process automation, robotics, security.
- It's got to be widely applicable to more than one industry
- It's got to not involve going back to college for a year of maths training.
- It needs some kind of moat. Learning to use Bootstrap better is nice cos your stuff looks better and is cleaner. It's kind of table stakes, but it has zero moat.
Intel secure enclaves look interesting to dive into and will have some decent moat factor.
- It's really got to be uISV-suitable. This means I can come up with at least two businesses that conceivably will work: selling element extraction from text is not a business, but "capturing ten KPIs from your base of legal contracts" is (and probably a hotly contested area).
I did the whole "walking through lists of industry SICs to generate ideas" thing - I even read every Y Combinator investment. No dice. Good business opportunities are hard to imagine.
GP here. I have no plans to work in this area, and have a stable employment/financial situation that I don't anticipate changing any time soon. My learning in this area is purely for enjoyment.
This is a good write-up, but it's also missing huge other sections of quant finance like statistical arbitrage and factor investing. I guess you could put this under the 'market taking' section, except everything he describes is still in the mode of single-stock thinking, while these strategies are more about portfolios. You typically estimate some factor model of the market and use a portfolio optimizer to create your trade baskets.
In these types of strategies you care about latency, but on the order of milliseconds, not microseconds. The challenge is in building multi-day predictive alpha models with low correlation to each other, getting good executions (though brokers can do a reasonable job these days), and especially combining alphas into one book, which requires sophisticated mathematical programming (conic programming, etc.).
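As a toy illustration of that last "combine alphas into one book" step - nothing like the constrained conic programs mentioned above, just the textbook unconstrained mean-variance weights w ∝ Σ⁻¹μ on made-up numbers:

    import numpy as np

    # Toy alpha-combination step: given expected returns (alphas) and a risk
    # model (covariance matrix), unconstrained mean-variance weights are
    # proportional to inv(Sigma) @ alphas.  Real systems add constraints
    # (gross/net exposure, factor neutrality, t-costs) and solve a cone program.
    alphas = np.array([0.02, 0.01, -0.015])          # hypothetical expected returns
    sigma = np.array([[0.040, 0.006, 0.004],
                      [0.006, 0.030, 0.005],
                      [0.004, 0.005, 0.050]])        # hypothetical covariance matrix

    raw = np.linalg.solve(sigma, alphas)             # Sigma^-1 @ alphas
    weights = raw / np.abs(raw).sum()                # scale to unit gross exposure

    print("Weights:", np.round(weights, 3))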
Many companies make a profit by doing something that benefits society in at least some way, society pays them for their service. E.g. selling electronics: the company makes a profit, but society benefits by being able to buy and use those electronics.
What about quantitative trading firms? They make a profit themselves, but how do they benefit society?
Here's a very simple service that everyone would understand.
Suppose you are a customer with a connection to an exchange. You can't afford to connect to every exchange, because that would cost you a small fortune.
If you stick your order into this exchange, you might have to wait around for someone to meet your price. In fact, there might be someone willing to trade with you at your price, but you don't know it because you're on different exchanges. (Let's not get into NMS.)
An HFT with a view of the whole market would help you distribute your order to all exchanges. They do this by leaning on your order while posting everywhere else. So if someone came in and met the HFT's order in some market, the HFT would turn around and fill you immediately for a small difference, and they'd pull all the other orders when this happened.
That's a service, and you pay for it.
A lot of HFT strategies provide a service similar to this in some way.
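A very rough sketch of the "lean on your order, post everywhere else" mechanic described above, with hypothetical venues and prices (no real market-access code, and real implementations differ in the details):

    # The customer has a resting bid on venue "A".  The HFT posts its own bids a
    # touch lower on the other venues, "leaning" on the customer's order as its
    # out.  When one of those bids is hit, it sells to the customer at their
    # price, keeps the small difference, and pulls the remaining quotes.
    CUSTOMER_BID = 99.00
    EDGE = 0.01

    our_bids = {"B": CUSTOMER_BID - EDGE, "C": CUSTOMER_BID - EDGE}

    def on_our_bid_filled(venue):
        """Someone sold into our bid on `venue`."""
        buy_price = our_bids.pop(venue)
        print(f"Bought on {venue} at {buy_price:.2f}; "
              f"selling to the customer at {CUSTOMER_BID:.2f} "
              f"(difference kept: {CUSTOMER_BID - buy_price:.2f})")
        # Pull the remaining bids so we don't end up with an unhedged position.
        for other in list(our_bids):
            print(f"Cancelling bid on {other}")
            our_bids.pop(other)

    on_our_bid_filled("B")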
Oh come on, if order books were generally accessible I could just see where stuff is traded right now and sell there by myself. It's just that market data for stocks is so heavily gated that HFT guys already have an edge by just buying data from all the exchanges for huge piles of cash.
If this was such a great service why is IEX becoming so big?
HFT is a nightmare from a CSR view. Those companies provide no real value to society (not by any standard) and even worse: By hiring all those talented young people with degrees in various fields they actually hurt society. Give those people real problems to work on, instead of wasting them on optimizing those HFT systems.
> If this was such a great service why is IEX becoming so big?
This is not an informed thesis. This is just an appeal reducible to the following statement: “A thing cannot be a good service if it has a large and organized opposition.”
>HFT is a nightmare from a CSR view. Those companies provide no real value to society (not by any standard) and even worse: By hiring all those talented young people with degrees in various fields they actually hurt society.
If your thesis is that HFT provides no benefits for society, please provide a substantive rebuttal against the claim that it improves liquidity and price discovery.
HFT increases the availability of bids and asks in the orderbook by rapidly volunteering to be the trading counterparty in otherwise inefficient markets. By seeking the best price for its own strategies and making as many trades as possible with as little latency as possible, the increased liquidity is accompanied by improved price discovery.
> Give those people real problems to work on, instead of wasting them on optimizing those HFT systems.
Please define “real problem”, and please explain how the aforementioned benefits contributed by HFT are incompatible with this definition. Meanwhile keep in mind that the vast majority of companies, including those in tech, likely do not work on “real problems”, so next I’m going to ask why this is a meaningful criticism in the first place.
IEX allows, and has, major liquidity-providing HFT firms & probably couldn’t operate without them.
Also IEX is a very small exchange, not one getting huge. Their PR firm is impressive, not their size.
Finally there are lots of trading options that allow you multi-exchange order book access. Getting market access is fairly cheap. The expensive part is the counter party risk mitigation. You need someone to prove you are good for the trades you say you are. That has nothing to do with HFT or electronic trading at all.
> Oh come on, if order books were generally accessible I could just see where stuff is traded right now and sell there by myself.
You'd have to write a feed handler for every exchange, and a real-time composite orderbook so you could see where the best order was at every point in time. For every stock. You'd need to rent the fastest network as well.
This is no different from "I could tile my own roof, what do roofers do that is so useful to society?"
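For a sense of what the "composite orderbook" piece looks like in the simplest possible case (top-of-book only, with hypothetical venues and quotes; a real feed handler parses each exchange's binary protocol and maintains full depth):

    # Simplest possible composite view: keep each venue's best bid/offer and
    # derive the market-wide best prices from them.  Quotes are hypothetical.
    books = {
        "NYSE":   {"bid": 98.75, "ask": 99.01},
        "NASDAQ": {"bid": 98.76, "ask": 99.00},
        "BATS":   {"bid": 98.74, "ask": 99.02},
    }

    def composite_bbo(books):
        bid_venue = max(books, key=lambda v: books[v]["bid"])
        ask_venue = min(books, key=lambda v: books[v]["ask"])
        return (bid_venue, books[bid_venue]["bid"]), (ask_venue, books[ask_venue]["ask"])

    (bv, bid), (av, ask) = composite_bbo(books)
    print(f"Best bid {bid:.2f} on {bv}, best ask {ask:.2f} on {av}")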
> Those companies provide no real value to society
A lot of businesses provide no real value to society. The only way to do that is to interact with a large portion of society. So restaurants for instance only serve the local market, no real value. I mean who benefits, other than the people who eat there? I could go on with a whole load of other businesses.
> You'd have to write a feed handler for every exchange, and a real-time composite orderbook so you could see where the best order was at every point in time. For every stock. You'd need to rent the fastest network as well.
Why would I need to rent a fast network for every stock? I didn't say abolish the exchange, I said make the data open, generally available, so I can see for myself what's going on. Also smart order routers are there already, so no, I don't have to write them again; every other broker already offers them. Now just cut out the HFT frontrunners and I can actually get the price I see on the terminal.
> A lot of businesses provide no real value to society. The only way to do that is to interact with a large portion of society. So restaurants for instance only serve the local market, no real value. I mean who benefits, other than the people who eat there? I could go on with a whole load of other businesses.
That may be right, but that doesn't make HFT any better.
> Now just cut out the HFT frontrunners and I can actually get the price I see on the terminal.
High frequency trading is not front-running. High frequency traders 1) do not have a fiduciary duty to other traders in the market, which is a hard (and definitional) requirement for front-running; and 2) cannot see a retail investor’s order before it is executed on an exchange. They can only alter their prices in reaction to orders that have already executed.
> I didn't say abolish the exchange, I said make the data open, generally available, so I can see for myself what's going on. Also smart order routers are there already, so no, I don't have to write them again; every other broker already offers them.
1. This data is already available for purchase. There’s nothing intrinsically preventing you from acquiring it as an individual. I’ve personally purchased this data as an individual.
2. Even if you can “see for yourself what’s happening”, without the extremely fast market making provided by HFT you’ll be waiting significantly longer just to find a trading counterparty. Your proposal sacrifices the liquidity and price efficiency provided by HFT in exchange for freely available market data, which the vast majority of retail investors will not be able to use (let alone want to).
3. Smart order routing is possible because there is sufficient inter-exchange liquidity. In fact, it is directly facilitated by HFT.
I’m not often this blunt on Hacker News, but in this case I think it’s warranted: you do not appear to have even basic familiarity with what you’re criticizing.
Glad this question came up, because I am always wondering the same. I also was approached several times to work in HFT as a software dev by some of the big firms. My subjective moral compass just seems to refuse anything having to do with such a seemingly corrupt and inhumane endeavor, although the technical challenges must be very interesting.
The greed and complete disregard for society and its advancement, as portrayed through the media and various outlets, simply make it a very unattractive career choice to me.
Now this is not meant as a judgement of a whole industry and its participants, it's simply my naive and uninformed opinion. I would like to read some comments from insiders who might have had my initial aversion but came to see something different that is positive in some way. Any HFT devs out there with a similar experience?
Provision of liquidity. If HFT didn't exist, you would still be able to buy and sell financial instruments, but it would be less convenient. Maybe you go to buy something and the offer doesn't appear until 10 minutes later. With HFT it's always available. Imagine your local supermarket. What do you need it for? You could drive straight to the distributor's warehouse and buy food directly from the distributor or the farmer who grew it. The local supermarket is a middleman which provides convenience of execution. Same with HFT.
Bad analogy. HFT is not like a supermarket. HFT would rather be the guys at the local supermarket running behind you, and right at the moment you're about to grab something off the shelves they grab it, buy it, and put it back on the shelf so you can only buy it at a higher price. And they'd be allowed to do so because they own parts of the supermarket. Oh, those guys don't exist at supermarkets? Well, funny...
Many of these firms are market makers. That’s a role that has existed in markets going back well into the 19th century. Market makers are contractually obliged to offer a price for any security on their books. That means if you want to sell, you always can, no matter what is happening in the market (unless it’s suspended). Nowadays all market makers (that I know of) trade algorithmically, and given the volumes and speeds at which modern markets operate, it’s pretty much the only way they could without having to charge considerably higher market maker fees, increasing costs for everybody.
Furthermore, if I want to sell a security and one of these companies puts in the best bid, then I just made more money than I otherwise would have if they weren’t there. Likewise if I was buying, they might offer me a lower price. Isn’t that a good thing? They are happy and the counterparty is happy.
Then there’s arbitrage. If I am trading on an exchange, but there’s a better price on another exchange one of these firms trades on, they might offer me a better price than I would otherwise get. Effectively I’d get a price on the other market without having the costs of trading on it. Again, that’s better for me, where ‘me’ might be your savings account or pension fund manager, or a manufacturer or producer on a commodities market. So these firms reduce costs for other participants on the market.
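For what the market-making side of this looks like at its most stripped-down, here is a toy quoting rule (the fair value, spread, and inventory skew are all made-up numbers; real market makers also manage adverse selection, fees, and hedging):

    # Toy market-maker quoting: quote a bid and an ask around an estimate of
    # fair value, skewing the quotes against accumulated inventory so the
    # position tends to mean-revert.  All numbers are illustrative.

    def make_quotes(fair_value, half_spread, inventory, skew_per_unit=0.001):
        skew = -inventory * skew_per_unit    # long inventory -> quote lower to sell
        bid = fair_value + skew - half_spread
        ask = fair_value + skew + half_spread
        return round(bid, 2), round(ask, 2)

    print(make_quotes(fair_value=99.00, half_spread=0.02, inventory=0))   # (98.98, 99.02)
    print(make_quotes(fair_value=99.00, half_spread=0.02, inventory=50))  # (98.93, 98.97)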
1. All trading strategies profit by identifying market inefficiencies.
2. Market inefficiencies occur (in general) when the price of a security is inaccurate.
3. By correcting the inefficiency, the strategy profits consistently until there is no longer a counterparty willing to take the trade at that (inefficient) price.
They would argue that they get paid to capture market inefficiencies and make the market efficient so goods, services and companies are properly priced.
Tsunami hits Japan and some Japanese computer memory factories are shut down. As soon as this news hits America, the prices of computer memory chips are expected to go up. But there is a delay in how quickly different people hear the news. Some American memory chip sellers might still be selling at the old (lower) price, while some of their better-informed customers seize the opportunity to buy before the prices go up. In this instance, the sooner the news reaches America, the sooner the American resellers know to increase their prices, and they get the benefit from the increased prices. (And some buyers end up paying more.)
Later, the Japanese have rebuilt and the factories are about to start producing again. Now memory chip prices are expected to go back down. The sooner the new information spreads around, the more people will benefit from buying at the cheaper prices. (And some resellers will disbenefit by having to decrease their prices sooner.)
The well-informed market makers keep the prices as current as possible.
Buyers benefit from new, cheaper prices, but sellers disbenefit. Or the other way round. But we believe that efficient markets are beneficial to society, so the benefit to whoever gains by buying/selling at the most recent price information is larger than the loss to whoever couldn't sell/buy at the old price.
The benefit to me personally is that I can buy something like SPY for my retirement savings and not worry that I'm getting ripped off on the price. I just picked SPY as an arbitrary example and I'm not making any investment recommendations in this comment.
The benefit to society is that less money and human capital is wasted on finance so more money and human capital can be allocated to more interesting things.
HFT is a tiny industry that generates a tiny amount of money and they've largely replaced what used to be a much larger industry that generated a lot more money. That sounds like a huge benefit to me.
People have a misconception of advertising as a noble, worthy, and necessary thing. In business school it’s taught that its purpose is to inform rational consumers about what to buy. Actually, if you look at advertisements, you learn that their purpose is to misinform irrational consumers into buying. Advertisers have learned to use emotional messages, sex, repetition and other irrational means to sell.
Quite. If a person needs a thing they will realise that themselves and go out and buy one. Advertising is 100% about things you don't need and have no real reason to purchase.
If there is money to be made, it means that some companies or products were not priced correctly. This means that someone is getting overpaid and someone underpaid relative to their value. Traders find the fairest price for a product, just like a trader in a traditional physical marketplace.
Even if you think the machinery of trading is a negative (I don’t) that’s no argument against firms like this. If anything it’s an argument for them as these firms do the same thing as their predecessors at lower margins and with fewer people.
The empty trading pits in Chicago's South Loop and all those new condos in downtown NYC are a testament to that.
The author of this article has also summarized most of the basics in a mini-book called "Max Dama On Automated Trading" which can be found online. Highly recommended.
The author says that he is giving a high level overview of the work done in quantitative trading firms, and then mentions that - "things like co-location, direct connection to the exchange without going through an API, using a high-performance language like C++ for production (never Python, R, Matlab, etc), Linux configuration (processor affinity, NUMA, etc), clock synchronization, etc are taken for granted."
I'm surprised to see that engineers are required to know hard-core VLSI stuff like processors and physical affinity to cores, Non-Uniform Memory Access, clock synchronization, et al. I thought only integrated circuit / digital design / chip design engineers were required to know that.
Would anyone kindly shed more light on this subject? Has anyone used VLSI / Digital Design topics as part of his Quant trading / HFT work?
I never thought of processor affinity as a VLSI topic. I mean you can muck about with it just using your OS, right? Also I did EE in college and it seemed a lot more low-level than that. Semiconductor principles, digital filters, JK Flip-flops, that kind of thing. I guess you end up meeting NUMA and affinity somewhere along the way.
The point the guy is making is that when you're coding this type of thing you need to understand how the computer actually calculates things, rather than just have a vague idea of a machine like you might if you code Python. In Python and those types of languages a lot of the inner workings are abstracted away. You can sort of just imagine an Oompa Loompa (a thread) reading the code, seeing the instructions (x = a + 2 if y==3 else a - 2) and going out to find the value of a in a box and putting the value of x in another box. No idea where the a and x boxes are, or how close they are to each other. No idea about how that branching instruction works either; he just checks at the time and decides.
By contrast when you're writing for performance you want stuff to be in cache, so you need to have an idea of a machine that includes cache. Specifically the idea that the further away you are from the registers, the longer things take, and that some cache is shared while some isn't. You also want to think about a machine that can speculate and do branch prediction, so you'll need to think about how to write the code so the branch predictor mostly gets it right. I think the top stackoverflow answer is about that.
Affinity and NUMA are talking to this kind of machine (which is still abstract of course) which has a bit more detail than the Oompa Loompa.
You are probably better off learning more about what Martin Thompson calls 'mechanical sympathy'. Check out the LMAX/Disruptor talks. There are some blogs around high performance/low latency computing.
This is not really about VLSI style hardware knowledge. The 'processor affinity and NUMA' config is all about minimising the cost of memory access, and maintaining cache consistency. They want to make sure that processes aren't continually being moved to different cores or accessing memory that is further (slower) away, to minimise latency. The 'clock synchronisation' talked about is probably not at the CPU clock level, but rather the wall-clock time, using GPS receivers or atomic clocks and NTP servers to ensure all involved servers are synchronised.
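For what the affinity part looks like at the OS level, here is a minimal Linux-only sketch (shown in Python for brevity; production systems typically do the equivalent from C++ via pthread_setaffinity_np, or with taskset/numactl):

    import os

    # Pin the current process to a single core so the scheduler doesn't migrate
    # it, which would cost cache warmth and, on a NUMA machine, could leave it
    # far from the memory it allocated.  Linux-only; CPU number 3 is arbitrary.
    print("Allowed CPUs before:", sorted(os.sched_getaffinity(0)))
    os.sched_setaffinity(0, {3})     # pid 0 = this process
    print("Allowed CPUs after: ", sorted(os.sched_getaffinity(0)))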
One of the main reasons here is minimizing the number of (FPGA) cycles it takes you from reading the first byte of the incoming UDP packet to writing the first (or last; depending on exchange gateways configuration) byte of the order that contains your reaction to the market data. If you're also doing computations "inline" (as opposed to what the author described in the article), it's even more complicated since you now have to build the books and calculate various things before committing to a decision. This may affect the general design of the strategy / execution strategy, so quants/engineers involved are often assumed to have the knowledge of how things work, down to the cycle level.
I took a tour of a high frequency trading firm once and they had a decent-sized division dedicated to their hardware customization for the machines they would place directly into the exchanges, which had chips they programmed directly. And of course it was all C++. Most of the staff who worked there were very specialized in low-latency C++ even if they didn't work directly in the hardware division as well.
I would be shocked to hear that they fabricated a chip. The cost and loss of flexibility vs an FPGA makes it unlikely. Very little is so static that the power and latency savings pays for itself. I wouldn't be surprised if these firms fabricated their own crypto ASICs though.
That's a good point. Since speed is one of the driving force in this business, software level cryptography won't be able to keep pace with such high speed transactions. Security has to be provided at the hardware level.
Knowing this level of detail was critical for the analytics and database development I've participated in.
Any time an algorithm must consult a dataset, and that data is changing rapidly, leveraging knowledge of the underlying system will give you an advantage.
I have heard from someone who develops S/W for one of these firms that they are concerned about instruction timing and have uncovered situations where some instructions occasionally take longer to execute. They go back to the manufacturer to find out why. They remain a couple generations behind the latest processors until they can qualify the newer ones for all of these details.
Actually I think that document is a far better overview, it actually outlines some statistical techniques (that are still very useful AFAIK) and doesn't focus exclusively on HFT.
Was aware of arbitrage and market making as a quant strategy but had never heard of market taking as an outsider. Thanks for sharing, helps give a little better picture of how the market works.
Having been in this space, I agree with most of the article. Good summary.
The one thing I would say is that it's very hard to make money doing this. The profits accrue to the select few who have cornered the market in terms of technology and speed.
Which is why firms who make money doing this usually have an edge beyond the technology. In one word: flow.
They are able to interact with a stream of orders... not theirs... that is low in alpha, and they are effectively able to get a "first look".
I'd say your probability of success is going to be low. Success is not solely defined by PnL. If you're making money then you will certainly have to deal with increased compliance and oversight from whoever is sponsoring your access to the markets - this comes with a cost.
Furthermore you can never just "sit" on a strategy. Eventually your edge gets arbed out and you need to be creative enough to data-mine the heck out of the market to find some new edge.
Some thoughts on figuring out your edge:
1. You're not the only smart person looking at something. There are armies of PhDs who have been there and done that.
2. Study regulation and regulatory changes. Keep abreast of new & upcoming changes.
3. Look for markets where barrier to entry is higher or not as studied. eg: Electricity markets are a _bit_ less understood or traded.
4. Always think in pairs or more... Longing or shorting something outright is almost never the answer. But trading the spread between two (3, 4, many) related products is more difficult to calculate and backtest (see the sketch after this list).
5. Always do the more difficult thing - but don't make it complicated. Simple math is usually what works.
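A toy version of the spread idea in item 4, on simulated prices - just a rolling z-score of the difference between two related series; real spread trading involves hedge ratios, cointegration tests, costs, and proper backtesting:

    import numpy as np

    # Two made-up related price series: b tracks a with noise, so their
    # difference (the spread) should mean-revert.
    rng = np.random.default_rng(0)
    a = 100 + np.cumsum(rng.normal(0, 0.5, 500))
    b = a + rng.normal(0, 0.8, 500)

    spread = a - b
    window = 50
    roll_mean = np.convolve(spread, np.ones(window) / window, mode="valid")
    roll_std = np.array([spread[i:i + window].std()
                         for i in range(len(spread) - window + 1)])
    zscore = (spread[window - 1:] - roll_mean) / roll_std

    # Go short the spread when it is stretched high, long when stretched low.
    signal = np.where(zscore > 2, -1, np.where(zscore < -2, 1, 0))
    print("Periods with an active signal:", int(np.count_nonzero(signal)))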
Thanks for your explanation, I don't have illusions that it's going to be easy. I rather think the opposite. For now I'm trying to get a basic understanding of trading and finance. This is just a hobby along with my programming job.
From your words I understand that I have got to be very creative in order to find an edge. Especially your advice on thinking in pairs or more is valuable.
Never knew FPGA programming was so easy in quantitative trading. Would anyone kindly shed more light on use of FPGAs in HFT & Quantitative Trading? Any web links or PDFs?
Not sure how helpful this is, but http://algo-logic.com/node/1 is a vendor of NICs that have FPGAs built in to do programmable algo trading VERY quickly. A friend used to work there.
FPGA programming is easy when your problem space uses event-driven state machines even on sequential CPUs as is the case for many things besides HFT, but it is not always so obvious.
Is there an error in this? "In this case book pressure is simply (99.00*10 + 98.75*5)/(10+5) = 98.9167." in the Market Structure signals section. I can't see how that equation relates to the trade data given just prior.
> Let two players each have a finite number of pennies (say, n_1 for player one and n_2 for player two). Now, flip one of the pennies (from either player), with each player having 50% probability of winning, and transfer a penny from the loser to the winner. Now repeat the process until one player has all the pennies.
> The chances P_1 and P_2 that players one and two, respectively, will be rendered penniless are
> P_1 = n_2 / (n_1 + n_2)
> P_2 = n_1 / (n_1 + n_2)
So if the current order book looks like:
- Sell 5 for $99.00 (best offer)
- Buy 10 for $98.75 (best bid)
And you chomp up orders randomly, flipping a completed order to the opposite position:
- the probability that the price is above $99.00 (all 5 sell got filled first) is 10 / (5 + 10)
- the probability that the price is below $98.75 (all 10 buy got filled first) is 5 / (5 + 10)
So expected value is (99.00 * 10 + 98.75 * 5) / (10 + 5) = 98.9167.
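A quick simulation of that random-walk argument, using the book above (5 lots offered at $99.00, 10 lots bid at $98.75), reproduces both the gambler's-ruin probabilities and the 98.9167 figure:

    import random

    # Gambler's-ruin simulation of the book above: each fair coin flip moves one
    # lot from one side of the book to the other; whichever side empties first
    # determines which price level the market trades through.
    random.seed(1)
    OFFER, BID = 99.00, 98.75
    N_OFFER, N_BID = 5, 10
    TRIALS = 100_000

    offers_cleared = 0
    for _ in range(TRIALS):
        offers, bids = N_OFFER, N_BID
        while offers > 0 and bids > 0:
            if random.random() < 0.5:
                offers -= 1
                bids += 1
            else:
                bids -= 1
                offers += 1
        if offers == 0:
            offers_cleared += 1

    p_up = offers_cleared / TRIALS                     # theory: 10/15 = 0.667
    expected = p_up * OFFER + (1 - p_up) * BID
    print(f"P(offer side cleared first) ~ {p_up:.3f}")
    print(f"Expected price ~ {expected:.4f}  (theory 98.9167)")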
I am not happy with this explanation. It seems like after you execute a buy for $98.75 you need to immediately put in a sell for it at slightly above $98.75 (and vice versa) to fit the random walk model described above. And then I took the expected value using the price at two different points in time. Overall I am still confused myself.
Finding tick by tick data is easy. Every major lit exchange sells it for a few hundred to a few thousand dollars per year, either directly or indirectly (through a data vendor). Just search for "market data price list" and the name of your preferred exchange.
That's still quite a bit of money. If you want to train some good models you're going to need a lot more than 1 year and you're going to need many more than 1 market.
Some exchanges make sales data available for free. If you are relying on the full market data stream, you are talking about a considerable amount of data - in some cases, terabytes per week. It may be less costly to buy the data than to connect and record it directly, especially for a proof of concept.
It's very difficult to touch the market independently. The amount of infrastructure that is needed for algorithmic trading is more than any single person could reasonably handle. Not to mention the markets are incredibly efficient already.
The overhead of developing a prediction algorithm is fairly low from what I understand. Implementing it into the existing trading market is impossible, I agree, but developing something that can predict stock changes does not sound out of reach.
It's likely a fun project, and quant trading seems like an interesting job. If you include on your resume your machine learning quant algorithm, your white paper, and how you've extended one market's training data to, say, altcoins/penny stocks, I'm pretty sure you'll pique an interviewer's interest.
Very interesting article, but heavily focused on high frequency trading...
I am really involved in low frequency trading with, as I call them, stock trading robots, and have been doing this for many years with double-digit returns on average, and all is done with Python and a few VPSs around the world for redundancy, i.e. a very low barrier to entry...
Nope, it is not day trading... I think you cannot win against the HFT shops, unless you spend millions on co-location and the whole arms race... No resources, everything is based on my own experience...
Frankly I won't... I am not selling or promoting anything, I don't need clients or investors, I am not promoting myself and I am not looking for a new job - quite the opposite, I am actually looking for an early retirement (but I just can't leave the team I am leading behind, so I keep postponing every year...)
One of the things that interests me is just how much, if any, of the development of these systems can be applied elsewhere outside of finance.
Now I get that there are some attempts to financialise other industries (usually with large social costs), but I just wondered if we are seeing this kind of technology creep out elsewhere (online advert auctions, perhaps?)
This implies that all quantitative trading is high frequency trading. I would think all high frequency trading is quantitative trading, but the converse is not always true.
Because then some of the most successful quant funds, like Two Sigma and RenTec, are not doing quant trading, since HFT is only a small portion of their business. You cannot manage $50B in an HFT strategy single-handedly. The author lives in a different dimension, but my dimension could be wrong!
HFT is best defined as predicting movements under a tick; everything above a tick is out of HFT territory. Latency requirements could still be pretty serious though, as many models at many firms can act simultaneously for longer frequencies as well.
Indeed. One of my favorite finance talks is "A Brief and Biased Survey of Quantitative Investing" by AQR's Cliff Asness, which talks about what quant funds actually are. Link for those curious:
That would be the case for Market Taking and Arbitrage.
They help with determining an accurate price (Market Taking) and with keeping prices equal on all exchanges (Arbitrage).
For Market Making the goal is not to form consensus on the state of the market. Instead, Market Making is about creating a liquid market. That is, making sure that when someone wants to buy or sell, there is a willing counterparty at a non-stupid price.
Alternatively, you could describe them as 'keeping the spread low'. That is, keeping the difference between the lowest buy order and highest sell order low.