Everyone seems to overlook the point here: yet again, Amazon was slow to be honest with its customers. I get that up/down reports help, but why keep using a service that lies to you about availability? I've read on HN in the past that the dashboard can only be updated to reflect an issue with approval (comments section on a similar posting; believe it if you wish).
So why not move to a hosting company that is transparent and open about its status? I won't make suggestions, as I don't want to be accused of shilling for a specific provider, but there are plenty out there. 45 minutes to update their public dashboard is too slow. They either don't care, don't monitor, or they're trying to hide their stats for fear Jeff will beat the staff for SLA violations.
If any other provider lied to customers the way AWS does, it wouldn't be tolerated. So why do you tolerate this behaviour from AWS?
Good luck convincing your company: "Hey, because they were 45 minutes late in informing us, we need to move our entire cloud to a different provider."
Updating a dashboard can easily be an automated process, but for business reasons it is not. AWS did not "lie" about the incident; they are extremely transparent about all outages and disruptions (by the way, this was a disruption, not an outage). They stated on the issue the exact time frame for when it started and when it ended.
Is it bad they were late? Definitely. AWS has a history of being late due to the sheer scale it operates at. I caused an outage myself when I worked there, and updating the dashboard requires several higher-ups to understand exactly what the issue is and whether it is considered worthy of "informing of an incident." These processes take time. Is it perfect? Absolutely not. But there are legitimate reasons for it.
I'm not sure why you think Jeff is involved here. This kind of disruption isn't enough to warrant someone as high up as Jeff getting involved.
As for SLA violations, AWS publishes public SLAs for every service and credits your account if availability ever dips below the defined thresholds. And as for caring, I don't know a single cloud provider with customer support as good as AWS's. This is extremely opinionated, but it's what I've observed in the industry.
I would recommend people use AWS monitoring, but having some basic internal dashboards and metrics of your own is also worthwhile.
Use a monitoring service to monitor the provider of the monitoring service? Wouldn't it be better to use a monitoring service hosted on a totally different provider?
I'm not even sure running your own monitoring is sufficient in this case. Sure it's useful to have, but when something goes wrong, the first thing I want to know is if it's us or them. If it's us, I/the team scramble to fix it in a panicked frenzy. If it's them (the cloud provider), and they acknowledge it early, even a simple "we're investigating an issue with X", we can at least take some comfort from the fact that it's out of our hands.
If we just don't know the cause, we assume it's us and jump into panicked frenzy mode. Panicked frenzy days are the worst days of my life, especially if it's discovered that it was all in vain.
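That "us or them" split can even be captured in a tiny triage function, run from a host outside the affected provider so a provider-wide outage can't blind the monitor itself. This is just an illustrative sketch; the function name and probe inputs are my own, not any real tooling:

```python
def triage(our_app_up: bool, provider_status_ok: bool) -> str:
    """Classify an incident from two independent probes: one against our
    own service, one against the cloud provider's health endpoint.
    Both probes should run from infrastructure hosted elsewhere."""
    if our_app_up:
        return "healthy"
    if provider_status_ok:
        return "us"        # our service is down but the provider looks fine
    return "them"          # provider trouble: out of our hands, no frenzy

# Example: our app is down and the provider's health check also fails,
# so the problem is most likely on their side.
print(triage(our_app_up=False, provider_status_ok=False))  # prints "them"
```

The point of the sketch is only that the answer to "is it us?" changes what you do next, which is exactly why an early acknowledgement from the provider is worth so much.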
I understand the frustration, but I'm not convinced monitoring at large scale is that straightforward.
The core question is: what constitutes degraded service? Would you say a service is experiencing downtime every time a 500 response is served? If you're serving millions to billions of requests per second, it seems a bit disproportionate to mark a service down after a single 500 error, so you need to work out some kind of acceptable threshold.
What about latency? Again you're just going to draw a line in the sand somewhere.
You end up with this big mix of metrics that define service quality, so you then have a kind of meta problem of deciding which metrics you should alert users on. Get too trigger happy and it's going to cost you money and customer trust, and your customers are going to get alert fatigue when it turns out the issue you alerted them about was more of a false alarm. Set the bar too high and you'll have angry customers wondering wtf is going on.
All that to say I don't think there's a right answer.
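To make the threshold idea concrete, here's a minimal sketch of a rolling-window error-rate check. The class name, window size, and the 1% threshold are all illustrative assumptions, not anyone's actual alerting logic:

```python
from collections import deque


class ErrorRateMonitor:
    """Rolling-window error-rate check: a single 500 among many requests
    should not flip the status page, so we flag degradation only when the
    error fraction over the recent window crosses a chosen threshold."""

    def __init__(self, window_size=10_000, threshold=0.01):
        self.window = deque(maxlen=window_size)  # recent request outcomes
        self.threshold = threshold               # e.g. 1% errors = degraded

    def record(self, status_code):
        self.window.append(status_code >= 500)

    def degraded(self):
        if not self.window:
            return False
        return sum(self.window) / len(self.window) >= self.threshold


monitor = ErrorRateMonitor(window_size=1000, threshold=0.01)
for _ in range(999):
    monitor.record(200)
monitor.record(500)
print(monitor.degraded())   # one 500 in a thousand requests: prints False
```

Even in a toy like this, the hard part is not the code but choosing the window and threshold, which is exactly the "line in the sand" problem described above.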
We were pretty liberal with posting to our status page for years and thought it was The Right Thing to do. I still do, to a point.
But what ended up happening was that a competitor who didn't have a status page at all would use our status page against us in the sales process. They simply never mentioned that they had nothing of their own to compare it to.
This was the same competitor who went 100% down for ~4 days during the busiest month of the year and only posted updates to a private Facebook group. There was data loss that was never publicly admitted to.
So, yeah, we implemented reasonable boundaries on what constitutes a post to the status page. We also adopted a new status page provider that let us get more granular in categorizing posts and allowed users to subscribe to only the "urgent" channels that pertain to them.
Before 2003-ish, Amazon used to have a static "gone fishing" page on www.amazon.com that was manually triggered during outages. After newspaper reporters wrote scripts that would detect the GF pages, they were removed and the site was allowed to just spew 500s for whatever segment of critical pages was busted.
Very fair, but 45 minutes into an outage/disruption before manually updating the public status page is poor service. Why is that acceptable for AWS to deliver to users?
Also, good luck trying to convince your company to migrate to another cloud provider over, say, implementing a multi-region strategy, which you should have been doing in the first place.
Highlight the lack of transparency in reporting outages and that's a start. If your MPLS or ISP provider operated in the same way, the company wouldn't accept it.
My company is not going to spend hundreds of thousands of dollars or more, and months or even years of effort, and add additional constraints to the given pool of candidates we are hiring for, to migrate to GCP or Azure or DigitalOcean or Hetzner or wherever is considered more trendy than AWS right now due to "a lack of transparency" lmao. I would look completely incompetent to even suggest the idea to anyone internally.
But your company is willing to accept poor service and, as a result, spend more money with the same provider to ensure continuity. So essentially you reward AWS for hiding its stats: they can claim high uptime figures, and when an outage happens it's the user's fault for not spending enough money on instances across many availability zones to cover for AWS's mess-up.

I get that redundancy is needed in systems, but the lack of proper reporting means users are forced to overspend out of fear. It's a great business model: hook the clients in with lies, then get them to reward you for hiding the facts. Clearly your company has money to burn wasting it like this. Everyone knows they lie, and blatantly; why is it tolerated? As I said, I don't see other enterprise providers getting away with this kind of behaviour towards clients.
If you are willing to host your critical infra on some dodgy startup alternative that might go away in 3 months, because you refuse to bend on your personal values and separate them from what the typical organization actually cares about, best of luck. I know HN tends to love the underdog, but there is a time and place for that, and a time and place to accept what you need to do to keep your services online.
So your logic is to accept poor-quality service to keep your service online, rather than trying to do better and improve the service. You are saying that rather than rewarding a company trying to do better, we should just accept poor service from AWS. How is this better than "hosting on some dodgy start-up"? This has nothing to do with my personal beliefs or opinions; I'm trying to understand why it's accepted from AWS but not from others.
Edited to add a point.
My logic is to build highly resilient infrastructure given the constraints available. Your definition of "poor service" is not what I have experienced in my 10-year career as an SRE, because I build around your definition of what makes it poor and make it work as it should. It's called chaos engineering, and companies like Netflix have been doing it for years with their Chaos Monkey tool and SRE practices. No matter what cloud provider you go to, there is ALWAYS unexpected and unannounced downtime unless you build around it, plan for it, and expect it. But sure, go ahead and tell us all how industry leaders like them are wrong for sticking with what you call "poor service."
OK, simple question: would you accept any other infra service provider having such poor customer service that it didn't provide an updated status for an outage/disruption for 45 minutes?
You are deliberately avoiding the counter-points I already specifically addressed in response to that question to the point we are stuck in a loop, so I am going to leave this thread now. If you feel you can do a better job at SRE with your current mindset and believe you are better at choosing which cloud providers are worth using for an org, I welcome you to try.
Many companies are hiring and retaining specialists in AWS lock-in technology who lack experience with other cloud providers' technology, so I don't know what's surprising.
Training and getting up to speed takes time and money, neither of which are unlimited for any organization. It's not that they/we can't work with other cloud services, it's that it would likely add up to months of additional on-boarding time to get someone who wasn't familiar with another cloud provider productive with infra at scale on said provider.
I didn't say you have to go "cloud": rent hardware in a DC and run it yourself, or use a VPS. I mean, the cloud is just "other people's hardware", and I'd thank you to not insult gorillas like that by comparing them to Amazon.
Colocation or especially server rental generally requires no persistent staffing. The datacenter has their own staff for tasks requiring physical intervention, and you have IPMI/iLO access to your servers for doing reboots and similar.
I'm sure there are two-bit VPS providers that claim to be cloud and are terrible. But for AWS's prices and claims of service, I dunno; they're at a scale where they don't have to care about customers.
Man that's cool I never worked on the carbon arc machines.
I remember having a xenon lamp weld itself into the holder, which was also the cathode connection. All because someone didn't tighten a grub screw, which meant the threads on the screw-in end welded themselves to the nut mount the lamp screwed into. I had to remove the mount and break the lamp in a dodgy way just to be able to force the threaded end out.
Similar to that, I used to find it hard to watch a movie in a different cinema where the projectionist didn't pay too much attention to detail when framing.
Sure, most of the picture was on screen, but with like 30 seconds more attention it could have been great.
Well written and nicely put. I served my time as a projectionist too and miss the job to this day. And digital isn't the same: while it's "picture perfect" with great colours, there is still a deadness to it that I can't explain. Not too long ago I was back in the old projection hall, which is now digital, and it just sounded wrong; I missed the mechanical sound of the old machines.
Not everything digital is better.
Some of it is nostalgia, sure. You say film degrades with every run, but to counter that argument: every time a sound or image is digitised, it degrades due to the quantisation noise that is added. High bit rates help, but it's just an artifact of the digital world.
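For what it's worth, the quantisation-noise point is easy to demonstrate: each bit of depth buys roughly 6 dB of signal-to-noise ratio, and the noise never disappears entirely. A quick sketch (the test tone and bit depths are just illustrative):

```python
import math


def quantization_snr_db(samples, bits):
    """Uniformly quantize samples in [-1, 1] to the given bit depth and
    return the resulting signal-to-noise ratio in dB.  More bits means
    less noise, but the digitised signal always remains an approximation
    of the original."""
    step = 2.0 / (2 ** bits)            # quantizer step size
    signal_power = noise_power = 0.0
    for s in samples:
        q = round(s / step) * step      # snap to the nearest level
        signal_power += s * s
        noise_power += (s - q) ** 2
    return 10 * math.log10(signal_power / noise_power)


# A full-scale 440 Hz test tone sampled at 48 kHz for one second.
tone = [math.sin(2 * math.pi * 440 * i / 48_000) for i in range(48_000)]
print(quantization_snr_db(tone, 8))     # roughly 50 dB
print(quantization_snr_db(tone, 16))    # roughly 98 dB
```

So higher bit depths push the noise below what the ear can detect, but they never make it zero, which is exactly the "close approximation" argument.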
True, and agreed, but digital is still only ever a close approximation; given the way the tech works, that's all it can ever be. It gets to the point where you can't notice, but it's still just a very good approximation.
Very true. I can't stand the screen door effect you see on a lot of digital projections. It's horribly distracting. I saw Dune in digital IMAX and spent the whole time thinking there was something wrong with the projector. I think they need some kind of final analog optical step that mitigates the pixelated look many have.
Plus with digital you never have the film getting stuck and melting - always a treat to witness!
I was an usher and witnessed it once, at the end of the credits. I radioed into box, as we called it, that the film was melting on screen. Next thing I heard feet pounding from the projection room, the light on the screen dimmed, and the burnt corona of the film disappeared from view.
Especially once everything was spliced onto one big reel, no, it wasn't common. But when you were switching from one projector to the other every 15 minutes, there were more things to mess up, and it could happen from time to time. More common was just a botched reel change of some sort, where the old reel ran out and the timing of the switch to the new one was off for some reason, or there was simply a problem with the film threading.
The only time I had it happen in 7 years was when, due to failing bulbs not striking easily, I had to strike them manually and left the manual switch in the on position.
Well, there was a power brownout, and that killed the basic automation on the machines that would close the dowser when the film stopped. So the film stopped, but the dowser stayed open and the bulb burned through the film. And for good measure, the exact same thing happened on two machines at the same time. I believe I was heard in a screen as I responded to the situation with something like "For f*k's sake" while restarting the other screens, before having to splice the two burned films.
To clarify, these machines were platter-fed, so there was no changing of reels. The dowser was there so the bulb could be lit while the leader was still running through without anything appearing on screen.
That's been bugging me since the start of the digital era. Slightly defocusing the projector should work, shouldn't it? Or perhaps someone could create some advanced optics to spread pixels out.
That's part of it, for sure, but even on a still scene 35mm projectors had a small bit of movement, because they were mechanical and alignments are never perfect. In the cinema I worked in, even sat in the screen you could still hear the hum of the machines if you knew what to listen for. To me it was all part of the cinema experience.
Digital has improved compared to the early generations, yes. But given the option, I'd watch 35mm over digital any day.
I totally understand this, but it's something like vinyl. Yeah, the vinyl doesn't have the same sound quality, but there's something about the little crackles and pops, that warm fuzzy sound in the background that gives me, well, a warm fuzzy feeling. Dropping the needle on Stairway to Heaven is just a different experience.
I choose that example in particular because LZ IV is one of the first albums I ever heard on vinyl. My dad had an old crate of records and a truly bitchin stereo setup that he let me dig out and set up in my room in my teens. He had told me the story of the first time he heard Stairway (in his friend's basement, on vinyl) and it just seemed like a fitting first run.
I also vividly remember the first time I heard The Yes Album, also on that stereo. The opening bars of Yours Is No Disgrace made an immediate impression on me and I had a burned CD of the album on repeat in my car for the next month.
This post is already way longer than I intended, but my dad's record collection was also how I realized that he had been a stoner as a young man. The various Pink Floyd, King Crimson, and Santana records should have been enough of a hint, but it was when I found his copy of Cheech and Chong's Big Bambu, and a conspicuously missing giant rolling paper, that I had the realization.
As an example, I recently saw Full Metal Jacket on 35mm. The last Boot Camp scene, the one in the bathroom, while unsettling in any context becomes positively haunting in 35mm. The lower light conditions introduce artifacting even in digital, but the slightly wobbly frame motion and the scratches and dust marks exaggerate that artifacting further. The resulting image is almost expressionistic, as though reality has become so horrifying our very perception of it is starting to break down. The same sort of effect recurs with the sniper at the end of the film. In between, the rougher, less pristine frames create a sense of grittiness that amplifies the mood of the film. In short, I thought seeing Full Metal Jacket on 35mm was a fully superior experience over watching it on digital (although I’ve only seen digital presentations of it at home and not in a theater).
You certainly can recreate something very similar in post, although that’s a bit like saying chiaroscuro lighting is achievable in charcoal, oil paint, and photography: the effect is still going to be distinct depending on the medium. Digital has a whole host of unique qualities and even some distinct advantages over film. But the image of film has a different quality to it as compared to digital: it simply looks different than digital projection or home video, even if you had a pristine, flawless film print. The difference in image is subtle, but your brain recognizes it the moment the projector starts: “oh, yeah, this is how movies used to look.” Just as photography hasn’t superseded all other 2d visual arts, I think it would be a mistake to treat film as unnecessary in the age of digital. I can certainly understand why a director and DP would choose to work with digital for the average case project these days though.
Things don't always have to be perfect. It's a personal preference.
While I appreciate the helpful suggestion, the first part won't really work, but the second part is fair: playing the sound at a low level could help with the experience.
What I mean is that this is my memory of the cinema and what I enjoyed. Digital achieves "perfection", but in a way which is lifeless compared to film. From the point of view of a projectionist, every machine had its own quirks when running a film. I used to run 7 machines in one cinema, and once you got to know the tone of the place you would often hear something starting to go wrong before seeing it: a click that shouldn't be there, or a platter sounding rough as you laced a film up. The work was interesting in ways that digital never will be.

I also prefer some things in electronics to be analogue rather than digital. And no matter the bit rate, remember that digital is only ever a close approximation of that colour or that shape; it's never perfect. It may get to a point where the eye can't see the difference and the ear can't hear the difference, but it's still only an approximation. I get it, I like old tech; if that's a crime, shoot me.
Edited to fix autocorrect issues.