"Trusted Advisor" on the AWS Console has been doing this for a long time, for several types of under-utilized, not only instances. Not only this but it also notifies you when you're reaching default limits so you can put in a support request for limit increase before you hit them.
You have to pay 10% on top of your bill to get Trusted Advisor (it's part of the Business support plan).
Last I checked there was no meaningful advice on instance size. I've seen one or two recommendations, but nothing useful.
There's lots of advice about reserving instances, though, and it's extremely poor. Following it would be a good way for us to waste $10k/month :D
The checks and recommendations aren't very sophisticated, though. E.g., the Low Utilization Amazon EC2 Instances check:
"Checks the Amazon Elastic Compute Cloud (Amazon EC2) instances that were running at any time during the last 14 days and alerts you if the daily CPU utilization was 10% or less and network I/O was 5 MB or less on 4 or more days."
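For reference, here's roughly what that logic looks like in code. This is just my own sketch against CloudWatch metrics with Python/boto3, not AWS's actual implementation; the 10% / 5 MB / 4-day thresholds come straight from the check's description above.

    # Sketch of the "Low Utilization" heuristic using CloudWatch metrics.
    # Assumes boto3 credentials are configured; not AWS's real implementation.
    import datetime
    import boto3

    cloudwatch = boto3.client("cloudwatch")

    def looks_underutilized(instance_id, days=14, flag_days=4):
        """True if daily avg CPU <= 10% and daily network I/O <= 5 MB
        on `flag_days` or more of the last `days` days."""
        end = datetime.datetime.utcnow()
        start = end - datetime.timedelta(days=days)

        def daily(metric, stat):
            resp = cloudwatch.get_metric_statistics(
                Namespace="AWS/EC2",
                MetricName=metric,
                Dimensions=[{"Name": "InstanceId", "Value": instance_id}],
                StartTime=start, EndTime=end,
                Period=86400,  # one datapoint per day
                Statistics=[stat],
            )
            return {p["Timestamp"].date(): p[stat] for p in resp["Datapoints"]}

        cpu = daily("CPUUtilization", "Average")  # percent
        net_in = daily("NetworkIn", "Sum")        # bytes
        net_out = daily("NetworkOut", "Sum")      # bytes

        low = sum(1 for day, avg in cpu.items()
                  if avg <= 10.0
                  and net_in.get(day, 0) + net_out.get(day, 0) <= 5 * 1024 * 1024)
        return low >= flag_days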
AWS has something similar, I think, though I'm not sure it's quite the same: CloudWatch [0]. CloudWatch provides alerting and can stop or terminate underutilized instances. But this sizing feature from GCP seems to allow for resizing an instance rather than just killing it.
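As a concrete (hedged) example of the CloudWatch side: an alarm can carry the built-in arn:aws:automate:<region>:ec2:stop action, which stops an instance when it idles. The instance ID and region below are placeholders.

    # Sketch: stop (not resize) an instance when average CPU stays under
    # 10% for 24 hourly periods. Instance ID and region are placeholders.
    import boto3

    cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")

    cloudwatch.put_metric_alarm(
        AlarmName="stop-if-idle-i-0123456789abcdef0",
        Namespace="AWS/EC2",
        MetricName="CPUUtilization",
        Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],
        Statistic="Average",
        Period=3600,            # hourly datapoints...
        EvaluationPeriods=24,   # ...for a full day
        Threshold=10.0,
        ComparisonOperator="LessThanOrEqualToThreshold",
        AlarmActions=["arn:aws:automate:us-east-1:ec2:stop"],
    )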
As far as why this may benefit Google: while a customer pays the same regardless, an oversized instance means there are fewer resources in GCP's overall pool of machines that others can use. To support the overall capacity of all their customers, Google needs more machines to handle them all. I'm guessing Google would rather have most of its machines active, since building out fewer racks/datacenters is probably better for the bottom line overall.
Eventually Google is going to disrupt the cloud industry, just as they disrupted the ad industry with analytics, the right pricing, and catering to users' exact needs.
Google started with a dominant search platform. They don't have that luxury with Cloud.
Google used to have a near monopoly on solving hard CS problems, with an army of very very very smart CS grads. They could produce software that no other company could.
These days, many other companies have invested in growing their own armies of top-tier CS devs, and so they are able to make equally impressive software.
While the advantage Google has enjoyed is slimmer, I think it's a fallacy to conclude that the technology has been commoditized.
Slight oversimplification, but Google has had the equivalents of S3/GCS and Hadoop since the early 2000s, and of Docker + Kubernetes for the past 10 years. This translates into a real-world edge.
For example, how many big data tools have even one of the 15 characteristics of BigQuery that I described at [0]? How many companies have inter-datacenter networking that gives you a petabit of bisection bandwidth, described at [1]? How many cloud providers have live migration, custom VMs, or insanely fast, RAM-like local SSD? How many PaaS offerings can sustain a behemoth like Snapchat? Or a NoSQL database that scales to 56 million qps with just a handful of engineers [2]?
Edit: To add more color. In days past, Google would release papers (GFS, MapReduce, Bigtable, etc.). With Google Cloud Platform we can just externalize these services (with some customer-friendly bits like isolation, pricing, etc.): Bigtable, BigQuery (Dremel), and Pub/Sub, to name a few.
These are good points. Google does offer services that nobody else has, some of which are very easy to use and competitively advantageous.
Except... customer service for cloud platforms is terrible. It's not only an image problem: a blog post about Google Cloud Platform screwing over a customer and just flat-out not communicating seems to appear every 2-3 months.
If you have any case IDs where you had terrible service with Google Cloud Platform support, please send them to me (tsg@google.com) and I will investigate them.
I was (and mostly still am) a big fan of Google BigQuery, especially since its SQL dialect supported some regex plus native URL parsing. That is one of the coolest services Google offers, and no other service can match it.
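To illustrate the kind of thing I mean (a sketch, not my actual workload): legacy BigQuery SQL shipped functions like REGEXP_MATCH and DOMAIN, queried here through the Python client. The mydataset.weblogs table and url column are hypothetical.

    # Hypothetical example of BigQuery legacy SQL's regex + URL parsing.
    from google.cloud import bigquery

    client = bigquery.Client()
    job_config = bigquery.QueryJobConfig(use_legacy_sql=True)

    sql = """
    SELECT
      DOMAIN(url) AS domain,                -- native URL parsing
      COUNT(*) AS hits
    FROM [mydataset.weblogs]                -- hypothetical table
    WHERE REGEXP_MATCH(url, '^https?://')   -- regex support
    GROUP BY domain
    ORDER BY hits DESC
    LIMIT 10
    """

    for row in client.query(sql, job_config=job_config).result():
        print(row.domain, row.hits)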
However, having used BigQuery extensively at my startup, I found that getting external data into it via Google Cloud Storage was a royal pain in my balls.
My broader point is that Google was able to disrupt the ad industry because it dominated web search by way of having a large number of exceptionally brilliant engineers. In 2016 that is no longer a unique advantage: many other big (and small) players have invested in hiring their own equally brilliant engineers.
Sorry to hear that. We have completely revamped our ingest mechanism lately. Feel free to ping me with the specific problems you had.
From my experience, most issues with loading data stem from data parsing, or from trying to make batch load behave like streaming load (we have an API for that too).
And, as I mention in my article: unlike any other database, batch ingest into BigQuery is entirely free, as in it doesn't compete with query capacity one bit.
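For anyone curious what the batch path looks like, here's a minimal sketch using the Python client; the bucket, file, and table names are hypothetical placeholders.

    # Hedged sketch: batch-load a CSV from Google Cloud Storage into BigQuery.
    from google.cloud import bigquery

    client = bigquery.Client()

    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,   # skip the header row
        autodetect=True,       # infer the schema from the file
    )

    load_job = client.load_table_from_uri(
        "gs://my-bucket/events.csv",   # hypothetical GCS path
        "mydataset.events",            # hypothetical destination table
        job_config=job_config,
    )
    load_job.result()  # runs as a free batch job, separate from query capacity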
This makes me a little nervous. The linked page admits that sometimes there are good reasons for machines to look "underused", but how much luck am I going to have explaining that to a bean counter who thinks he's seen a way to save some money?
This is the standard banner for alpha/beta services. Some become permanently free once they go GA (e.g. Cloud Shell) while others become paid (e.g. Cloud Vision API).