Inside Equinix's NY4 data center where Wall Street trades (bloomberg.com)
79 points by dodders on April 18, 2016 | 34 comments



I've been to NY4! It really is less magical than these articles make it seem. Yes, the security is a bit imposing. For anyone familiar with datacenters, though, it is pretty run of the mill except for its tremendous scale.

Worth shouting out to Lucera - I have met the CEO and he is a really thoughtful technologist. They are doing some awesome work bringing AWS-style computing to the low latency space. Some interesting details on their deployment: http://www.enterprisetech.com/2014/02/20/amazon-cant-catch-l...


If high-frequency trading were limited or abolished, would Lucera's advantage over AWS disappear?


Not really. Even in a world where HFT is limited, cross connections provide consistent connections between trading entities. While AWS provides over-the-Internet or "black boxed" networking on a best efforts basis, Lucera's connections are point to point.


If you are at all into this sort of thing (data centers) and you ever get a chance to go, I strongly recommend it.

The amount of tech in this building is just staggering. As the article hints at, everyone who is anyone in finance has a spot there. This creates a network effect: the more people who are colocated there, the bigger the draw it becomes for the remaining few who aren't.

In the interconnected world of finance, data center cross connects become very important. It's not unusual for a prop trading firm to have 30 different cross connects to vendors, exchanges and sell side firms.

So I guess it's not surprising that there have been companies formed with the sole intention of connecting companies that are already in the data center. Radianz and Lucera are mentioned in the article, but there are usually half a dozen in each of the main data centers in the New York and Chicago areas. As usual, Nanex does a decent job of describing why there are so many data centers and why they are all at least 10 miles apart.

http://www.nanex.net/aqck2/3532.html

Another article from today that talks about the new data center.

https://medium.com/@RobinWigg/wave-a-final-goodbye-to-this-5...


The Medium piece is interesting. I'm still not sure where I stand on HFT; the technology is endlessly fascinating, I'm just not sure about the ends to which it is put.


How does one go about visiting such a facility? Do you have to be a customer or can anyone just request a tour?


I visited an Equinix data center in San Jose and it was an incredible experience. The redundancies and failsafes they have in place are mind blowing. No two doors to the same room can be open at the same time. All doors use handprint readers.

They have racks of big batteries filling a huge room that kick in automatically in case of power failure. Those will last about an hour, until a 10,000 HP generator on the roof starts and powers the data center for a week.


Typically the operators will have an agreement so they can get the generators refueled should the outage last longer than their on-premises fuel supply, so in theory they could run indefinitely on generator power.


An outage requiring generator power for longer than a week might not allow transportation of fuel. Just want to bring that up in case people really did think you can run "indefinitely" on generator power if the roads, bridges, and fuel supply network are out, or if there's a state of emergency, war, or terrorism.

In 1wk+ territory, it will start to look something like Katrina, where the roads were all blocked by floodwaters and fuel resupply missions required the protection of the National Guard.

I mean, you can charter helicopters to bring in fuel, but in those cases there are more urgent things to power than a datacentre, for example hospitals and shelters for emergency humanitarian aid.


> At the 1wk+ territory, it will start to look something like Katrina ...

This reminds me of a blog that was live-updated during Katrina by a data center operator: [1]. They managed to keep everything running through the whole storm, IIRC, but it was pretty dicey -- moving barrels of diesel by hand, staying on-site and surviving with water rationing, losing all but one upstream fiber, etc. Fascinating reading in hindsight.

[1] https://en.wikipedia.org/wiki/Interdictor_(blog)


Where I used to work we had refuelling agreements with multiple vendors. Yes, hospitals and critical infrastructure are prioritized first, but we ran on generators for an extended spell during the Northeast Blackout of 2003[1] (the outage affecting our facility only lasted a couple of days, but due to the utilities' appeal to reduce load on the grid until it was fully functional we decided to run on generators for over a week).

Edit: One of my fond memories of that time was actually greeting the refuelling truck and directing the driver to the point where he could plug a hose into a pipe on our building and start pumping.

[1] https://en.wikipedia.org/wiki/Northeast_blackout_of_2003


In the terrorism case, what stops someone from renting cabinets, bringing in "servers" loaded with explosives, and blowing the place up? Or even a molotov thrown at one of those big fibre entrances. Sorting that out, I'd imagine, would take a while.


Never underestimate what people would do to keep the stock market running...


Or do! They closed the markets in the US for two days during Sandy even though NY4 (and the other US equity exchange data centers) were fine.

http://www.wsj.com/articles/SB100014240529702047893045780871...


Sometimes it feels like the opposite. Any day that has the slightest hint of holiday to it is "markets closed! remember no work tomorrow! hooray for national silly lala day!"


These days, I'd hope they can failover to standby hosts in "IL#" or "VA#" or another datacenter in a nearby region.

The DR/BC space has become pretty robust, and with the right vendor, a datacenter loss can be recovered from simply by powering on fully-replicated VMs in another datacenter.
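(For the curious, a toy sketch of what that "power on the replicas" recovery amounts to; the site names, classes and methods below are made up for illustration, not any vendor's actual API:)

    # Toy disaster-recovery failover sketch (illustrative only, no real vendor SDK).
    # Idea: replication keeps the standby VM images current, so "recovery" is mostly
    # powering on the standby copies and repointing clients once the primary dies.
    from dataclasses import dataclass, field

    @dataclass
    class VM:
        name: str
        powered_on: bool = False

        def power_on(self):
            self.powered_on = True
            print(f"powering on replicated VM {self.name}")

    @dataclass
    class Site:
        name: str
        healthy: bool
        vms: list = field(default_factory=list)

    def fail_over(primary: Site, standby: Site, dns: dict):
        """If the primary site is down, bring up the standby and repoint DNS."""
        if primary.healthy:
            return
        for vm in standby.vms:
            vm.power_on()                    # images are already replicated, just boot them
        for host in list(dns):
            if dns[host] == primary.name:
                dns[host] = standby.name     # repoint service names at the standby

    # Usage: NY4 goes dark, the replicated VMs in a hypothetical "IL1" take over.
    ny4 = Site("NY4", healthy=False)
    il1 = Site("IL1", healthy=True, vms=[VM("order-gateway"), VM("risk-engine")])
    dns = {"trading.example.com": "NY4"}
    fail_over(ny4, il1, dns)
    print(dns)                               # {'trading.example.com': 'IL1'}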


This was an issue after Katrina.


Heh - I remember back in '01 or '02, being let into the PAIX DC and left alone for around 90mins with a backpack full of hard drives and a toolkit.

It was the cheapest/fastest way for us to get our "big" database from Sydney to the US - hop on a plane with about 9kg of hard drives as carry-on. Never underestimate the bandwidth of a 747 full of 340 Megabyte hard disks ;-)
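(Back-of-the-envelope, with a made-up drive count and flight time since I don't remember the exact numbers:)

    # Sneakernet bandwidth estimate. Assumptions (not exact figures): ~25 disks in
    # the 9 kg bag, roughly 14 hours gate-to-gate Sydney -> US west coast.
    drives = 25
    drive_capacity_mb = 340                        # 340 MB per disk
    flight_hours = 14

    total_mb = drives * drive_capacity_mb          # 8,500 MB
    seconds = flight_hours * 3600
    throughput_mbps = total_mb * 8 / seconds       # megabits per second

    print(f"{total_mb} MB in {flight_hours} h is about {throughput_mbps:.2f} Mbit/s sustained")
    # ~1.35 Mbit/s, i.e. on par with a dedicated T1, and trans-Pacific bandwidth
    # circa 2001/02 cost far more per megabyte than a plane ticket did.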


Equinix loves the flashy interior decorating, but it doesn't do anything for your uptime or latency. Datacenters really aren't that interesting; maybe the "meet me room" has some neat looking network gear, but by and large DCs are visually monotonous. Question: aside from having a high concentration of Wall Street firms colocated there, what is interesting about this? What innovations are they making?

The funny thing about all the security (the man traps, the palm readers, the alarm systems) is that there is still usually some security guard at the front desk who is often napping. Not that I blame them, but it's kind of laughable.


What is it with these new auto-play video distractions lately on bloomberg.com? Does that really improve time spent on the site?


Equinix is really a top tier provider. I have some space down in San Jose SV5 and it's pretty awesome. If anyone is interested in a colo swap let me know. I have a few units free and would love to barter it for some space maybe a bit closer to SF.

Also just for kicks I requested a quote for the SV5 center a few months back. They are roughly 3.2k / month for a full rack [1]:

[1] http://i.imgur.com/ZPDm3gi.png


What makes them a "top tier provider"? They provide power, cooling and rack real estate. They don't have a backbone. There are plenty of carrier neutral facilities that provide the same for cheaper. They all provide similar SLAs these days.


That seems like a bargain for the level of assurance they provide. Two such locations with fail-over cost about the median salary of a single IT worker.
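(Rough numbers, using the SV5 quote above and an assumed round figure for a median IT salary:)

    # Rough cost comparison: two colo racks with fail-over vs. one IT worker.
    rack_per_month = 3_200                         # the SV5 quote above
    sites = 2
    annual_colo = rack_per_month * sites * 12      # $76,800 per year
    assumed_it_salary = 80_000                     # assumption, not from the thread

    print(f"two racks: ${annual_colo:,}/yr vs one IT worker: ~${assumed_it_salary:,}/yr")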


(The easiest way to cause worldwide financial panic is to bring a GPS spoofer near NY4. Most HFT operations synchronize time via GPS receivers.)


Any GPS clock you are going to find in NY4 is going to be sophisticated enough to check the GPS signal against its internal oscillator for validity. It can fall back to the internal oscillator if the GPS time looks bad.

The other reason this is unlikely to be a problem is that network time protocols break all the time (oops someone did a maintenance and broke sync with the master clock). Lots of real world trading code has sanity checks on the time.
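(Roughly the kind of sanity check I mean; a toy Python sketch with made-up thresholds, not anyone's actual trading code:)

    # Toy timestamp sanity check: reject wall-clock readings that step relative to
    # a monotonic reference by more than a configured tolerance, and fall back to
    # extrapolating from the monotonic clock when they do. Threshold is illustrative.
    import time

    MAX_STEP_SECONDS = 0.005           # refuse wall-clock steps larger than 5 ms

    class SaneClock:
        def __init__(self):
            self._last_wall = time.time()
            self._last_mono = time.monotonic()

        def now(self):
            wall = time.time()
            mono = time.monotonic()
            expected = self._last_wall + (mono - self._last_mono)
            if abs(wall - expected) > MAX_STEP_SECONDS:
                # Wall clock stepped (bad sync, spoofed source, botched maintenance):
                # keep using the monotonic extrapolation and raise an alert.
                print("WARNING: wall clock step detected, using monotonic fallback")
                wall = expected
            self._last_wall, self._last_mono = wall, mono
            return wall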


This is why I mentioned a spoofer, not jammer :)


It's the same problem. Assume my GPS clock is synced to actual GPS. You turn on your spoofer and now my clock sees that the GPS source is running faster or slower than it should be so it falls back to the internal oscillator.
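(A simplified sketch of that check; the drift figure and intervals are made up, and real GPS-disciplined oscillators do this in firmware with much tighter bounds:)

    # Compare each GPS-measured interval against the same interval counted by the
    # local oscillator. If the implied frequency error exceeds what the oscillator
    # could plausibly drift, treat GPS as invalid and hold over on the oscillator.
    OSCILLATOR_DRIFT_PPB = 50          # assumed max drift of a decent OCXO, parts per billion

    def gps_looks_valid(gps_interval_s: float, local_interval_s: float) -> bool:
        error_ppb = abs(gps_interval_s - local_interval_s) / local_interval_s * 1e9
        return error_ppb <= OSCILLATOR_DRIFT_PPB

    # A spoofer stretching a nominal 10 s window by 50 microseconds implies a
    # 5,000 ppb frequency error, far beyond anything the oscillator could do,
    # so the receiver holds over instead of following GPS.
    print(gps_looks_valid(10.000050, 10.0))    # False -> fall back to oscillator
    print(gps_looks_valid(10.0000002, 10.0))   # True  -> within drift, accept GPS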


What if I am careful not to mess up and introduce only microsecond/millisecond level corrections? :)


Those GPS driven clocks would be marked as falsetickers and thrown out during the pruning process. I would hate to have my plan to create financial chaos blocked by the presence of one Symetricom box with a CDMA card and/or off-site clock associations.
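(For anyone curious, falseticker pruning works roughly like this simplified sketch; it is not the actual ntpd intersection algorithm, and the offsets below are made up:)

    # Each time source reports an offset plus an error bound, i.e. an interval that
    # should contain "true" time. Keep the largest group of sources whose intervals
    # mutually overlap; everything outside that group is dropped as a falseticker.
    from itertools import combinations

    def truechimers(sources):
        """sources: dict name -> (offset_seconds, error_bound_seconds)."""
        def overlap(a, b):
            lo_a, hi_a = a[0] - a[1], a[0] + a[1]
            lo_b, hi_b = b[0] - b[1], b[0] + b[1]
            return lo_a <= hi_b and lo_b <= hi_a

        names = list(sources)
        for r in range(len(names), 0, -1):
            for combo in combinations(names, r):
                if all(overlap(sources[x], sources[y]) for x, y in combinations(combo, 2)):
                    return set(combo)
        return set()

    clocks = {
        "gps_antenna_roof": (0.0000020, 0.0000005),   # spoofed: 2 us fast
        "cdma_card":        (-0.0000001, 0.0000005),
        "offsite_fiber_1":  (0.0000002, 0.0000005),
        "offsite_fiber_2":  (0.0000000, 0.0000005),
    }
    print(truechimers(clocks))    # the spoofed GPS source is pruned as a falseticker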


Who needs a spoofer when you can make a jammer or an EMP? (Don't do this. I have a friend-of-a-friend story about a guy who did this at uni and got visited by men in black suits; the professor almost got fired.)


What tremendous overhead Equinix is taking on with their facilities and clientele. The Bloomberg author mentions their annual filing[1], which discloses that they have a net loss and 'substantial debt'.

[1] https://www.last10k.com/sec-filings/eqix


The numbers I saw were positive. It was over $100mil in 2015 with projected $10 in 2016. Revenues are huge. I'm a bit tired so I might have missed the negatives, but it looked good at a glance.


Together with LD4 in London and TY3 in Tokyo, I think more than 50% of the world's trading activity goes through these 3 buildings.


Wow, I remember this project! I was involved in the electrical commissioning for the first phase of the buildout. I remember the 2-hour car rides at 5 AM to get there. One night I had to stay up overnight doing a UPS load test there; it was cool walking around with nobody in sight, boring otherwise.




