
Pages 9-11 of this pdf [1] have the full text of the article

[1] https://transition.fcc.gov/transaction/aol-tw/exparte/disney...


> this outage probably cost them just over a million bucks

I have always wondered whether it is accurate to think about it like this. It is interesting to know how much companies make in revenue per minute, but that doesn't necessarily translate one-to-one into lost revenue during an outage.

For example, if I wanted to purchase something on Amazon but the retail site went down, I would not visit another site to purchase it. I would just wait until the site came back up. In that sense, the revenue isn't lost, just delayed.


That works with e-commerce, but not necessarily with advertising. If you were going to browse Facebook during lunch in the office (and be shown ads), but Facebook is down, then you won't necessarily just "browse Facebook later in the day instead." You weren't on Facebook to catch up; you were on Facebook because you had nothing better to do.


This is definitely true; I expect that users will spend a bit less time on Facebook today.

However, Facebook doesn't have a 100% fill rate of high-value ads, so during the rest of the day they'll be showing somewhat more expensive ads to users, earning higher average revenue. That will make up for some of the loss, despite lower usage.
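
As a toy illustration (with made-up numbers, not Facebook's actual fill rates or CPMs), here's roughly how a higher average ad price for the rest of the day can claw back part of the revenue lost during the outage window:

    # Hypothetical numbers purely for illustration; real fill rates and CPMs differ.
    daily_impressions = 1_000_000_000         # impressions served on a normal day
    outage_fraction = 0.5 / 24                # ~30 minutes of downtime
    normal_cpm = 5.00                         # average $ per 1,000 impressions

    lost_impressions = daily_impressions * outage_fraction
    lost_revenue = lost_impressions / 1000 * normal_cpm

    # With fewer impressions available, the auction skews toward higher-value ads,
    # so the remaining impressions earn a somewhat higher average CPM.
    remaining_impressions = daily_impressions - lost_impressions
    uplifted_cpm = 5.05                       # assumed ~1% uplift for the rest of the day
    recovered = remaining_impressions / 1000 * (uplifted_cpm - normal_cpm)

    print(f"lost: ${lost_revenue:,.0f}, recovered via higher CPM: ${recovered:,.0f}")

With these assumptions, roughly half of the lost revenue comes back through the higher average CPM; the real split depends entirely on the numbers, which are invented here.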


For e-commerce that argument makes sense, but not for advertising.

They lost 30 minutes of primetime eyeballs from tens or hundreds of millions of people who would have seen ads but didn't.


I wonder when this picture was taken. The iPhone design in the simulator, the old Gmail favicon, and the old OS X scrollbars all indicate it is several years old.

Wouldn't be surprised if this code has been rewritten multiple times since.


Let's see: the Simulator is running, at the latest, iOS 6, so it was probably taken before the iOS 7 GM (September 18, 2013). Based on the scroll tracks in Xcode, it also looks like it was running Snow Leopard; Lion was released in mid-2011, so maybe even earlier.

Between iOS and macOS, and given the acquisition was in 2012, I'd imagine these photos were taken around then.


Their code is using manual reference counting, so it must have been taken before ARC was introduced at WWDC 2011.


Not necessarily; for the same reason many companies haven't jumped on Swift (specifically, not wanting to run hundreds of thousands of lines of code through a potentially buggy translator), people didn't immediately jump onto ARC either. If I recall, that migration was far from painless. I remember serious regressions introduced when I bit that bullet, and since all your code changed at once, tracking them down was hardly entertaining. IMO this could have been taken anywhere from 2012 to 2013...

This code also appears not to be using automatically synthesized properties. On the left screen, they're directly accessing ivars, and their ivars have trailing underscores rather than leading ones (the default when synthesized). I suppose they could be doing this anyway, though it wouldn't exactly be standard industry practice.

On the right Xcode screen, though, the code does use property syntax, so potentially ARC, although one doesn't imply the other.


ASIHTTPRequest? Yep, this is ancient.


Not to mention the Hipstamatic-esque picture.


I imagine they must do this to minimize risk for certain accounts based on factors unknown to me. I don't know what your error message said exactly, but perhaps the restaurant itself placed an upper limit on delivery orders because they don't like missing out on the tip for large orders. This is purely speculation on my part, of course.

Some of the fancier places will easily run over $25 of food per person, so if you order for two, $50 is not difficult to reach.

Out of curiosity had you ordered from DoorDash multiple times before you attempted that group order?


I did order before. I thought it may have been some kind of fraud detection, but I'd ordered from them multiple times to the same address. In fact, one of the first times I ordered, they double-billed me; it took a week or so to straighten that out with support. But even if it was some kind of fraud-prevention flag, it would have been nice to be stopped at that amount while ordering.


Pricing [1] since the anandtech live blog doesn't list it.

* Quadro RTX 8000 with 48GB memory: $10,000 estimated street price

* Quadro RTX 6000 with 24GB memory: $6,300 ESP

* Quadro RTX 5000 with 16GB memory: $2,300 ESP

[1] https://nvidianews.nvidia.com/news/nvidia-unveils-quadro-rtx...


I know meta comments are unwelcome but "nvidianews.nvidia.com/news/" is so egregious that it caught my eye.

It looks like their entire website is a huge legacy mess.

"news.nvidia.com" leads nowhere.

"nvidia.com/news/" leads to "nvidia.com/content/redirects/news.asp" and produces "Corrupted Content Error"

The page you are trying to view cannot be shown because an error in the data transmission was detected.


I suspect it's probably something like: spending money on the website appears to do nothing to increase sales.


Alas, there's nothing at https://nvidianews.nvidia.com/news/new.


That's kind of amazing... also impressed you took the time to determine this.


I understand the technical novelty, but from a financial perspective, for NVIDIA and for the customers and data centers that put these things in: is this a big deal?

Is this just routine news-cycle stuff that Amazon and Microsoft are going to buy en masse, as they would have no matter what NVIDIA came out with?

Or is this a big deal for reasons you can explain, such that the stock market is going to go wild trying to "price this in"?


> Quadro RTX 5000 with 16GB memory: $2,300 ESP

For reference, I purchased my first 3D card, from Evans & Sutherland with 32MB of RAM, to run Softimage in ~1996 for $1,800.


I found an old Computer Shopper from the late 1990s and had forgotten how ridiculously expensive computer equipment was - the real sub-$1000 market wasn't even a thing for PCs until 1997, and the range between a sub-$1000 computer and an expensive one was astounding even for day-to-day tasks.


Prices back then were really high...

I won an AT&T Safari NSX/20 laptop [1] in 1992 in the ACM Programming Competition. RRP was $5749 then, for a 386SX processor running at 20MHz, 4MB RAM and a monochrome screen. $10,200 in today's money. It was actually a beautifully made machine.

A year later, I switched to a Dell with 386DX and a 387 math coprocessor because my PhD needed the number crunching. That cost twice as much (i.e. around $20k in today's money), paid by the military lab sponsoring my research.

In our current times of cheap compute, it is easy to forget how much top-end computers cost 25-30 years ago.

[1] https://books.google.co.uk/books?id=AoKUhNoOys4C&lpg=PP142&o...


> the real sub-$1000 market wasn't even a thing for PCs until 1997

YES! I was at Intel when they were doing the initial research to see whether a sub-$1,000 machine was even feasible, with the Celeron... This was when they were literally paying companies millions to optimize their software for Intel processors, so that there would be software the market found palatable enough to buy along with the machines, while feeling they were getting real compute power for their buck.

/lawn.


My 386SX at 20 MHz with 2MB RAM and a 40 MB HDD, bought in 1991, was about €1,500 in today's money.


I haven't done the math, but 1996's $1,800 seems like more money than 2018's $2,300.


Apparently, $1,800 in 1996 = $2,891 in 2018...

But the amount of actual capability one gets from one of these cards is thousands of times more than what one could do in 1996.
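
For what it's worth, the adjustment is just the ratio of consumer price indices. A quick sketch using approximate annual-average US CPI-U values (the index values here are approximations, not official figures):

    # Approximate annual-average US CPI-U values; treat as illustrative.
    cpi_1996 = 156.9
    cpi_2018 = 251.1

    nominal_1996 = 1800
    adjusted_2018 = nominal_1996 * cpi_2018 / cpi_1996
    print(f"${nominal_1996} in 1996 is about ${adjusted_2018:,.0f} in 2018 dollars")
    # -> roughly $2,880, in the same ballpark as the ~$2,891 figure above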


$2,891


$40,000+ for Indigo2, also running Softimage! Daylight robbery.


1991: SGI 4D/340 with VGX gfx, 4x MIPS R3000, 64MB RAM, 700MB disk: $180,000.

I sometimes laugh to myself when people complain about the price of GPUs. Yes, $10,000 is a lot, but in historical context, it's pretty reasonable for top of the line technology.


Beyond graphics, I think of all the "terascale" talk in the high-performance computing world when I was in school. Now your consumer GPU does multiple Tflops, instead of us hoping a supercomputer might someday reach 1 Tflop for a few lucky users.

The same thing has happened with RAM and storage. My first Linux PC had 20 MB of RAM and 80 MB of disk and was sufficient to do most of my CS projects at university in the early-mid 1990s. Now, a sub $200 smartphone has over 100x the space, while desktops are commonly 2000x.

The research side not only moved on 1000x to "petascale" but that's now boring and there is real talk of "exascale" with the same gleam in the eye. One million times the performance we dreamed of at the beginning of my career, though I think this is partly by expanding the scope of one machine to larger and larger distributed systems as well as scaling up the capacity of individual elements.


As a man from history, I concur. It's amazing what we get today at those prices... but they could come down further, too!


Inflation-adjusted $1800 in 1996 is ~$2850 in 2017.


$10,000? They'll have people buying the whole lot anyway. Buy NVDA.


Yes, they will. Not even counting cryptomining, universities and big businesses use NVIDIA-powered supercomputers with 5,000+ $10k cards in them, for deep learning, astrophysical modelling, and other things.

Previously those cards have been the ~$8k-$10k P100 or V100. Not any more.


Interestingly, I game on a P1something through AWS and parsecgaming.com; very cool tech, and actually worthwhile for me price-wise, since I don't game often enough to invest a couple grand in a gaming PC.


AWS and GCE as well: they can put these in their datacenters and run them 24/7, with customers renting them for a few minutes to hours for the odd task, researchers for hours or days to run a huge job, and the spot market making sure they never go unused. For them it's a matter of what percentage of the time the cards are used and how much they can charge per compute hour. They can earn the investment back within a year.
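
A back-of-the-envelope payback calculation (the hourly rate and utilization here are assumptions for illustration, not actual AWS/GCE pricing):

    # All inputs are assumptions, not published cloud prices.
    card_cost = 10_000          # $ per card, per the list price above
    price_per_gpu_hour = 1.50   # assumed effective $ earned per GPU-hour
    utilization = 0.80          # assumed fraction of hours the card is rented

    hours_to_payback = card_cost / (price_per_gpu_hour * utilization)
    print(f"payback after ~{hours_to_payback / 24:.0f} days of operation")
    # -> roughly 350 days with these assumptions, i.e. within a year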


The RTX 5000 is actually not too expensive compared to the launch prices of the Volta series.


Reading this piqued my interest in graphs, especially considering I never learned anything beyond the basics taught in algorithms and data structures courses in college.

Any recommended resources (e.g., textbooks) for learning more about graph theory as well as currency arbitrage?
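
Not a textbook recommendation, but the usual way currency arbitrage gets framed as a graph problem is worth knowing: use edge weights of -log(exchange rate), so a sequence of trades whose rates multiply to more than 1 becomes a negative-weight cycle, which Bellman-Ford can detect. A minimal sketch with made-up rates:

    import math

    # Made-up exchange rates for illustration: rates[a][b] = units of b per unit of a.
    rates = {
        "USD": {"EUR": 0.87, "JPY": 111.0},
        "EUR": {"USD": 1.16, "JPY": 129.5},
        "JPY": {"USD": 0.00905, "EUR": 0.00775},
    }

    # Edge weight -log(rate): a cycle whose rates multiply to >1 has negative total weight.
    currencies = list(rates)
    edges = [(a, b, -math.log(r)) for a, dsts in rates.items() for b, r in dsts.items()]

    # Bellman-Ford from a virtual source (all distances start at 0).
    dist = {c: 0.0 for c in currencies}
    for _ in range(len(currencies) - 1):
        for a, b, w in edges:
            if dist[a] + w < dist[b]:
                dist[b] = dist[a] + w

    # If any edge can still be relaxed, a negative cycle (an arbitrage loop) exists.
    arbitrage = any(dist[a] + w < dist[b] - 1e-12 for a, b, w in edges)
    print("arbitrage opportunity?", arbitrage)

With these toy numbers, USD -> EUR -> USD multiplies to about 1.009, so the check prints True; real market rates and fees would look very different.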

