The SEO spam exists in part because Google does a really bad job of surfacing any site that isn’t content-heavy. This pushes sites into churning out keyword-rich fluff content in order to get any organic traffic.
I have first-hand experience with this. My niche, hyper-local online service (which everyone loves) was completely undiscoverable through Google, despite the niche having nearly zero relevant results and zero competition of any kind. It seems Google didn’t like the concise, no-BS one-pager for my service.
My options were to keep spending on Facebook ads indefinitely or to create SEO spam content of my own. Within a month or so of publishing fluff articles I was gaining plenty of organic traffic and could stop ad spending altogether.
A good example of a larger company doing this is DigitalOcean. Fortunately they’ve been extremely ethical about it, and the content is genuinely very good and helpful.
By hyper-local I mean something relevant to a small geographic region, like a city or state.
To be clear, that aspect was mostly tangential to my point.
I think the point is that Google is really bad at understanding web apps, online services, or really any site that isn’t information-rich.
The majority of web apps, for example, only really need a landing page and a sign-up form. But if that’s all you have, Google’s algorithm will show zero interest. And that seems to be the case even if you put in the work to optimize the content and provide plenty of metadata.
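For context, "plenty of metadata" on a one-page service site usually means something like the sketch below. This is a hypothetical example: the business name, description, and URL are made up, but the tags themselves (meta description, canonical link, Open Graph, and schema.org JSON-LD structured data) are the standard ones Google's own documentation says it reads.

```html
<head>
  <title>Example Pizza Co. — Wood-Fired Pizza in Springfield</title>

  <!-- Shown as the snippet under the result in most SERPs -->
  <meta name="description"
        content="Order wood-fired pizza for pickup or delivery in Springfield.">

  <!-- Tells Google which URL is the authoritative version of this page -->
  <link rel="canonical" href="https://example-pizza.example/">

  <!-- Open Graph tags, used for link previews and consumed by some crawlers -->
  <meta property="og:title" content="Example Pizza Co.">
  <meta property="og:description" content="Wood-fired pizza in Springfield.">
  <meta property="og:url" content="https://example-pizza.example/">

  <!-- schema.org structured data; LocalBusiness is the type Google
       documents for local results -->
  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Pizza Co.",
    "url": "https://example-pizza.example/",
    "address": {
      "@type": "PostalAddress",
      "addressLocality": "Springfield"
    }
  }
  </script>
</head>
```

The frustrating part is that even with all of this in place, a thin one-pager still tends to lose out to sites with a steady stream of article content.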
Imagine you own a pizza joint, but the only way Google shows your site to any of your potential customers is if you commit to publishing an article about pizza every month. That’s basically the boat a lot of us are in with web apps and other online services.
> I think the point is that Google is really bad at understanding web apps, online services, or really any site that isn’t information-rich.
You keep using that phrase, but Google doesn't select for information richness; it selects for verbiage. That is at best orthogonal to, and more usually opposed to, information richness.
Verbiage is the result of bad copywriting and/or lazy keyword stuffing. I don’t think they select for or reward verbiage, but they’re certainly not doing enough to treat it as a negative signal.
Well, not directly, but you're heavily penalized if you don't keyword-stuff and then run the result through an AI tool until it is 'simple to read' (i.e. it says what you're trying to say extremely badly, five times over, in almost-correct words that avoid any jargon).
As a result, even content written with an earnest intent to communicate has to read exactly like blogspam in order to rank.