When Google first came out and we tried it there were two things we liked about it.
The first: Good search results.
Yeah, everyone knows that, right? But young people today might have forgotten the second reason we liked Google search back then:
The Second: A clean page with a single logo and search field. It loaded quickly. There were no banners everywhere, no bs.
HackerNews has that feel. It is clean, information dense, and does what it needs to do.
Google seems to have forgotten this and now thinks it's some kind of design company. Everything it has applied Material to, with the exception of AdWords and Analytics, is now garbage.
Google News used to be my default homepage, but surprisingly MSN gives me a much better product, with more than 4 stories on the page on a 32" monitor.
I miss the Google of the “to organize the world's information and make it universally accessible and useful” era.
Google Plus could have been fine if it wasn't rammed down users' throats. But no, every product had to integrate with it, had to look like it... because... Steve Jobs said so? Then came Vic Gundotra, with all the high craftsmanship of political empire-building maneuvering and little else. That probably broke the original internal culture, if it had even survived till then (not sure).
The new Google News page is awful. It's a classic case of a user interface designer making things pretty with no thought about how to display news. The old page was much better; it was denser. I friggin hate Google News now and am looking for a better news source. I'm not loving text-only; I'd be happy with a text-only top level and then text & pics for the stories.
The poster is maybe referring to James Damore's interview with Joe Rogan, where he claims that Google executives admit to diversity procedures in hiring.
Disclaimer: I have no opinion on this topic. I am merely speculating the potential source of info.
The Swedish public broadcaster still keeps their Teletext[0] service running. It's one of the cleanest ways to get your news (especially on a TV, where load times are zero). You can access it on the web as well: https://www.svt.se/svttext/web/pages/100.html
It even has a mobile app, which is the fastest-loading news app I know of; it loads even on the slowest connection. Due to the restrictions of the medium, the text is short and to the point. And since there are no analytics with Teletext, clickbait has never arrived there. More sites should follow that model.
I'd add that Google News used to have that clean feel too, many years ago. With each generation it gets worse.
The latest generation, using AMP, is nearly unreadable on my Nexus 5; it bogs the phone down to a crawl.
A while back I did a comparison of the code behind the major search engines' landing pages, and found that DuckDuckGo is by far the cleanest, simplest, and most accessible, yet they still manage to make a living on text ads in their results.
Another easy way to see the difference is to load each site up in a text-only browser like links. You'll find that DDG has the cleanest interface there, too.
I still use Startpage most of the time due to the generally better search results, but DDG is my backup and sometimes finds things Startpage (i.e. Google) doesn't.
DuckDuckGo is a pretty good engine for 90% of searches.
I wonder however whether its (perhaps tenuous) connection with Yandex and Russia should give pause? I admit I have only read this in a headline several years back.
Well, Yahoo was designed to be a web directory, which eventually evolved into a portal (until the mid-2000s, Yahoo search would still show "directory").
Google was all about search at the beginning. I still went to Yahoo and MSN for news and information. But I think some time after the mid-2000s, a lot of folks began shifting from going to a portal for information to typing keywords like "news" into Google search. Somehow we got hooked on typing keywords. When Google finally released news.google.com, users could get a quick overview of current events. This behavior carried over into the era of social media: so many people now get their news by logging onto Facebook / Twitter. To verify, as a smart reader I would search on Google, hoping to find a full version from reputable news sites.
In some countries/cultures the portal is still preferred, e.g. Yahoo Japan. There is still some value in Yahoo.com; I still go there if I am looking up finance news or some pop-culture entertainment news (no other news site does entertainment news better than Yahoo).
The second reason was the predominant reason. We already had a lot of options for search results, with search engines like AltaVista and aggregator sites. It's the simplicity of Google that made Google.
PS. It's also the reason Google is now increasingly less special (but they do try to keep it simple when they can, subtly).
I'm not sure that was the predominant reason. The search results were also a hell of a lot better than AltaVista, Lycos, Yahoo and whatever else was around back then.
Google was a huge improvement because if you searched for Foo Bar Baz it would only give you pages that had all the words Foo, Bar and Baz whereas AltaVista et al. would give you pages containing any of the words, if memory serves. But the speed was a huge deal too, especially on a 56k modem (and the thing had to be rendered in IE4 on a Pentium II too, let's not forget).
AltaVista had a full set of boolean operators, so you could search for [Foo AND (Bar OR Baz) AND NOT Quux], which I still miss. Google appeared just when AltaVista ‘graduated’ from a showpiece for the Alpha to trying to make money and therefore turned to crap.
I don't know why, but many times Google just doesn't load here. I always open a private tab in Firefox to get rid of cookies easily, then load Google; I don't know if this is the reason, but after a while it just stops.
DuckDuckGo is (for me) the new Google in terms of speed, but the search results are not the same quality.
DuckDuckGo is very slow for me when it comes to opening things in a new tab (Safari on a laptop). It takes several seconds for the right-click options to come up.
I'm not sure if the reason behind this is because of possible weak phone connections or that the hurricane is a good way to promote the text only site, but either way, huge shoutout to CNN for going forward with this themselves.
I've thought for a while about scraping news sites to just show their text, or something like classifying articles based on their subjects from different ones. On this front, they did it first.
All the major sites switched to something like this when the internet slowed to a halt. I assume they've always had a minimal version ready in case something like it happens again.
I was working at AOL Time Warner on 9/11 and our team helped out the CNN team to keep that site online. It was a crazy experience. I recall that Sun Microsystems had provisioned a huge datacenter room full of gear for another AOL project as a sort of "try before you buy" deal. We spent the afternoon kickstarting those machines and bootstrapping them as CNN content servers. It was a huge struggle just to keep everything online, given the massive surge in traffic that we were seeing. America hadn't seen an event like this since Pearl Harbor, and most people didn't know what to do besides stand in front of a TV or--if you were at work--refresh CNN all day. The idea of infra capacity to handle a news event of this magnitude wasn't something that had even been considered prior to 9/11.
It's not text only but it's considerably more minimal than most news websites, loads almost instantly, doesn't have auto play content, and is considered a reputable source.
The BBC used to be similar but their international website is awful now (slow to load, more adverts, and less emphasis on actual news).
The full npr.org provides transcripts for many or all of their stories. This site leads me to stories that only contain bylines and no other content. Seems like a bug to not include at least a link to the audio or the transcript.
Just use uBlock Origin in advanced mode and disable third-party everything. Most sites look completely broken until you find and enable their one CDN (and only that!), while 100 other requests stay blocked. Save those rules as you go and it's easy browsing from then on!
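For anyone wondering what those saved rules look like, here's a rough sketch of uBlock Origin dynamic filtering rules (the "My rules" pane) that block all third-party requests globally and then noop a single CDN for one site. example-news.com and cdn.example-news.net are made-up names; substitute whatever the site actually needs:

* * 3p block
example-news.com cdn.example-news.net * noop

One noop line per site is usually all it takes once you've found the right CDN.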
NPR developer here. We launched some changes to the text site this morning.
NPR set up a text site in late 2001. A developer reworked the text site in 2005. We have made very few changes since then.
The site is a set of a few PHP scripts querying our MySQL CMS database directly with no caching. The HTML and JS were aimed at providing a decent mobile experience in 2005. That included a lot of hacks and workarounds for extremely defunct platforms.
The newsroom asked us to make some improvements. The main features:
- display more news stories following editorial order rather than reverse chron,
- remove the obnoxious interstitial "read more" view for stories,
- put the text site behind Akamai like our other web properties,
- configure HTTPS on the text site (still in progress and won't be the default due to TLS overhead)
It should be a lot faster and more pleasant to use.
Thank you to everyone who made lots of noise about CNN's text offering. There is a small contingent of developers at NPR who love the text site to the point that they've created replacements as personal projects. They were very excited to improve it for the public!
If you click Topics -> News[0] it takes 2.7 seconds. They should throw that subdomain behind https://memcached.org/ (and not use php lol).
Unrelated: it seems like I can't listen to podcasts. There's an <audio> tag (which doesn't use data unless you click play, if you set preload="none" [1]), so that's an interesting choice.
$ curl http://thin.npr.org/t.php?tid=1001
[...]
Total wall clock time: 2.7s
$ ping thin.npr.org
[...]
round-trip min/avg/max/stddev = 76.407/78.359/86.985/2.442 ms
It wouldn't stop surveillance, because there are only a limited number of pages it can serve. It's possible for an attacker to download all of them and figure out what you're reading by traffic analysis.
> To overcome government censorship and surveillance.
But the site isn't being censored? Also, HTTPS won't stop the government from knowing that you connected to those servers. I agree that we want to avoid censorship and surveillance in general, but it really doesn't seem relevant here.
> To stop internet providers from injecting ads and tracking scripts.
Is your ISP actually doing that to you right now? Or is that just hypothetical?
> Is your ISP actually doing that to you right now? Or is that just hypothetical?
I was on a Southwest flight earlier this week which did exactly this, using HTTP injection to display an overlay on every HTTP page. It's certainly useful to provide flight information (or Amber alerts, weather information, billing alerts), but it's Just Wrong™ to violate the integrity of a communication to do so. Perhaps there should be some standard protocol for ISPs to send messages to clients, permitting the connected OS to determine how to display them?
I can't remember which country it was, but it was either a Vodafone or an O2 SIM which would inject their little banner at the top of websites that weren't HTTPS. It was super annoying, especially seeing it on my own site!
A café I go to sometimes for coffee injects ads to non-HTTPS websites. Full screen ads with a timer. It's a good reminder that HTTP sites can be and are being arbitrarily manipulated and surveilled by WiFi operators.
The absolute worst thing about cnn.com is the auto-playing videos. I've been in quiet places and clicked a CNN news link only to have a video start blasting on the speakers. It's not only annoying but a huge waste of mobile bandwidth.
Now I just need an extension that rewrites all CNN urls to this site.
0.0.0.0 ht1.cdn.turner.com # Autoplay video
0.0.0.0 ht2.cdn.turner.com # Autoplay video
0.0.0.0 ht3.cdn.turner.com # Autoplay video
0.0.0.0 ht4.cdn.turner.com # Autoplay video
0.0.0.0 ht5.cdn.turner.com # Autoplay video
0.0.0.0 ht6.cdn.turner.com # Autoplay video
0.0.0.0 ht7.cdn.turner.com # Autoplay video
0.0.0.0 ht8.cdn.turner.com # Autoplay video
0.0.0.0 ht9.cdn.turner.com # Autoplay video
0.0.0.0 a.teads.tv # Autoplay video
0.0.0.0 t.teads.tv # Autoplay video
0.0.0.0 cdn.teads.tv # Autoplay video
Not all those hosts are populated yet, but this is effective.
How do I disable autoplaying video on my Android phone though? It's really annoying that news sites think it's ok to use my mobile bandwidth with redundant videos.
That's the absolute worst thing about EVERYTHING on the web. It's usually a video of absolute nonsense too, for example just the bullet points from the text below. It's as if we've got shitty PowerPoints, with sound, that play on every. single. page. Drives me insane.
I play music while I work. I got tired of all the programs that insisted on using sound effects or playing other sounds, so now my computer's speakers are mechanically disabled, and I have a separate system for music.
I'll connect it to my sound system if I want to hear something from the computer, which is rarely.
EDIT: As I wrote this, someone deployed a server-side rendered version of the site. Now the site is perfect. :)
This site appears to load ~350 KB of JavaScript, which I think is a bit excessive for a "lite" text-only site. From the sourcemaps, I found a long list of libraries, including:
* react
* redux
* redux-thunk
* react-router
* axios
* base64-js (why not window.atob/btoa?)
* core-js
* fbjs
* react-hot-loader (should not be in a production build)
* ...a bunch of other smaller modules
There's only about 10 KB of non-library application code. Note that I ignore gzip when evaluating this sort of stuff, since that many bytes of code still need to be parsed, no matter how much it compresses.
For the person who made this site, I would replace React with Preact (or Inferno), which should remove most of the bloat. Server side rendering would also be nice for those who don't have JavaScript enabled and would also improve the loading time.
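For what it's worth, a minimal sketch of how that React-to-Preact swap typically looks in a webpack config, assuming the site is built with webpack and uses the preact-compat shim (a sketch, not the site's actual build config):

// webpack.config.js (sketch)
module.exports = {
  resolve: {
    alias: {
      'react': 'preact-compat',
      'react-dom': 'preact-compat'
    }
  }
};

The application code keeps importing react and react-dom as usual; only the bundle size changes.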
This website contacted 65 IPs in 7 countries across 35 domains to perform 314 HTTP transactions. [...]
In total, 4 MB of data was transferred, which is 12 MB uncompressed. It took 2.753 seconds to load this page.
This reveals why I didn't see anything on my phone the first time I visited it. My browser just showed a blank page, then seconds later the contents showed up.
FWIW, base64-js and similar libraries are still needed (though probably not in this case...) because window.btoa() only works on Latin-1 strings. Try giving it some arbitrary UTF-8:
btoa("\ud83d\ude0b") -> Uncaught DOMException: Failed to execute 'btoa' on 'Window': The string to be encoded contains characters outside of the Latin1 range.
(Edit: apparently HN doesn't either; changed to \u...)
You are 100% correct. Honestly, it doesn't need CSS even, although that's nice to have.
I've got NoScript on, and now the pages load perfectly. I think I may return to being a CNN reader now. If only they could figure out how to monetise this.
For the rest of the sentence... I was clearly referring to the question of why someone would use a library for base64 versus the built-in function. Not getting involved in the anti-js internet pitchfork mob.
Because when you're treating React as a templating language, it's easier to let it run on the client side during development and add server rendering later.
I don't know React. Can you elaborate on what you mean by using React as a templating language? Are you saying React can be used sans any business logic, purely for template interpolation?
React basically does two things:
* Generates an HTML/DOM tree based on a template and data.
* Lets you efficiently update that DOM based on updates to the data and/or model (this is its killer feature).
But if you're just going to do the first thing (generate HTML/DOM), you don't need most of React's feature set, and you certainly don't need to push heavy JS onto the client.
For those cases, a much better approach (for both clients and SEO) is to just pre-render the DOM/HTML server-side and serve that, without any need for JavaScript client-side (which is what the similarly named module "Preact" does).
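As a concrete illustration (a minimal sketch, not how the lite site is actually built): rendering a hypothetical headline component to plain HTML on the server with react-dom/server, so the client gets markup and no JavaScript:

const React = require('react');
const { renderToStaticMarkup } = require('react-dom/server');

// hypothetical component: one headline rendered as a list item
function Headline(props) {
  return React.createElement('li', null, props.title);
}

// runs on the server; the output is plain HTML, no client-side JS required
const html = renderToStaticMarkup(
  React.createElement(Headline, { title: 'Hurricane relief update' })
);
console.log(html); // <li>Hurricane relief update</li>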
Josteink said it excellently, my addition is that React itself is a view library that lets you update DOM easily based on some data, and the data (and the logic behind it) can be and often are completely separated from React components themselves.
Seriously, I hate this trend of bloating up websites with the rage of a thousand suns. There is absolutely no value in it, but it increases bandwidth usage, load times, and CPU/RAM usage. News websites having "fancy, modern" designs actually makes me think less of them.
Agree that some sites are producing some really high-quality work leveraging the modern web. The problem, for me, is that many sites think of everything as _content_ that must be flashy (but hopefully not Flash-y). If I click to read an article about the devastation in Barbuda, I don't necessarily need the full production of related content to be thrown in my face along with the text that I really want to read.
This is the highest-impact format with which one can consume news, though. Magazines are great, but lots of people are just looking for a quick shot of the current news. This sort of UI is ideal for that need.
This is exactly why the Drudge Report has stayed relevant for 20 years.
death to auto-playing video always and forever... CNN's site is especially bad about that, as I've experienced cases where I stop the "big" video at the top of the page and scroll down, only to have it start back up again when it moves to the sidebar.
Agreed, it is one of those irrational triggers for me where I go from fine to furious almost instantly. Just imagining it happening gets my blood pressure up.
How about a Chrome extension where you can put domains on a time-based blacklist for offenses against your sensibilities?
So if you go to CNN and they autoplay an ad, you can blacklist them for 1 week - Should you forget or carelessly click one of their links, this pops an interstitial page that reminds you of their offenses and gives you the opportunity to rescind your request for a page view.
Additionally there is an opt-in for you to report the offenses to a central service for public scrutiny/shaming... thoughts?
You probably want the "HTML5 Autoplay" Chrome extension which stops the overwhelming majority of these. It was a problem on Bloomberg as well as CNN, but now you'll see the window spin for a second and then stop.
I wonder how many HN readers would be unemployed if the world went this way. It's pretty interesting to think that a good chunk of the people you comment with are the ones coding bloated websites (most likely not because they want to!).
I actually think that this would be one of the best outcomes of something like UBI. Devs and designers would no longer have to keep redoing websites just to justify their paycheck. They could spend times improving their sites only if they felt it was needed.
It's interesting to me to see shifting viewpoints around this. I've been building the Bloomberg Terminal for a long time, and to both users and programmers coming from the web ecosystem, there is often an immediate (somewhat) negative reaction as to why anyone (really any business, not person) would pay so much money for what the software provides. But once in a while, when you see something like this come along, you see viewpoints specifically pertaining to news and what everyone wishes they had was clean text-and-basic-photos delivery of relevant news without any ad-, web-, tracking-, social- bloat of the modern <insert any news corporation> website. That is specifically what businesses are paying for, and it is just one feature; what they get for their money is much larger than just one or a dozen news sites. They get something on the order of 100k feeds (e.g. WSJ is 1 feed) in dozens of languages from around the globe in all their text-only glory. So yeah, it could look like a throwback from another era. Or, alternatively, it could be exactly what everyone wants :)
I think there is a happy medium. For my tastes, the text-only CNN is too bare. I don't need pictures, but I could use a one- or two-sentence summary of the stories to give me more of an idea of whether I want to read the whole thing or not.
"This is what the world needs." maybe the world needed back in 1983 ted turner never to start up 24/7 news? Didn't and doesn't human DNA jive better with less news, less often?
It's more than that. We used to have a morning edition newspaper and then the late edition. That was it: two times a day to get fresh news. Even 9-to-5 was too much.
No value to the user, but all that bloat allows for analytics and tracking for more ad $ and the images make emotional manipulation easier to get more clicks. Not to mention that 99.9% of people don't care or don't even know what bandwidth, CPU, or RAM are.
There is a significant value proposition for this type of clean interface when you consider users who rely on accessibility tools to browse the web. I had the pleasure of working with an engineer whose vision was impaired. It was truly inspiring to see all the self-engineered tools and software he created to sanitize sites of all the bloat for consumption with a screen reader. This type of site will make his life a little easier.
I feel your pain; I thought it was either a convolutional or a cellular neural network and was very disappointed. But thanks for pointing to some actual good material to read and look over.
To add to your trend, here is an article [1] about "Texture Classification and Segmentation by Cellular Neural Networks Using Genetic Learning" (texture, not text, as they are more vision-related), though it would be interesting to see by what methods they could be applied to text.
Is there a collaborative community effort for "fixing" websites to make this "100 steps forward"? The relative ease and the large benefit of doing so make it surprising I haven't seen a list of lite websites maintained by volunteers.
In my understanding, Firefox Reader Mode (and similar features in other browsers) reads in the entire page and, based on analyzing the page for cues, then decides to offer the user an option to choose reader view (or not).
So while it helps improve readability, it doesn't cut down the time to load, the amount of data downloaded, and probably helps improve the battery life only marginally (this is debatable, depending on the amount of active JS, auto-play videos and also at what time offset t the user chooses reader mode after the page loads).
I love this. Pure information without bloat. It's one of the reasons I love "Teletekst" ("teletext"). It also looks very nice, despite the fact that it is 30 years old. 101 is news. 818 is domestic football. It's actually still very popular in The Netherlands, and the app is one of the most used apps here.
This is awesome. Major props to CNN. Reminds me of an article that appeared on HN not too long ago about Conde Nast needing Google AMP because their site is so bloated.
Does this belong to CNN? It's great to read news without clutter, just clear content. I hope there are many sites like this; it saves you from loading the whole page and then clicking reader mode or similar.
I said they use the same CDN hosts with the same forward and reverse names. Which I find quite convincing, on top of everything else.
dig lite.cnn.io
lite.cnn.io. 18 IN CNAME turner-tls.map.fastly.net.
turner-tls.map.fastly.net. 20 IN A 151.101.1.67
dig www.cnn.com
www.cnn.com. 39 IN CNAME turner-tls.map.fastly.net.
turner-tls.map.fastly.net. 18 IN A 151.101.1.67
I realize this kind of defeats the purpose, but I applied just a tiny bit of CSS (and Open Sans) to make it look slightly better. Here's the userstyle in the hope it's useful to some:
One notable missing feature: the article pages don't have an article title. That's annoying if you like to open a set of pages from the main page in tabs before reading them. The current pages at lite.cnn.io don't seem to have extra JavaScript.
This might be counter-intuitive to the types who comment here, but the CNN Snapchat channel should also be appealing to you for the same reasons this text-only site is. You go through their stories until you find the one you want, and that's all you get. No ticker tape, no flashing "BREAKING NEWS" decorations, and no fluff. It's the same concept, only in a visual medium.
This has been top of hackernews all day. Funny how as we move further and further into the future, developing all these interfaces as we go, we still can't beat a format that has been here since the beginning.
Brilliant. This achieves some key user-centred aims; content is easily navigated and given prominence above any other feature. I wonder if the addition of ARIA roles might add some useful context for screen readers?
In an age when personal assistants are gaining popularity, maybe sparse text-centred interfaces are set to gain more popularity too?
This is better than CNN.com, but the big problem with CNN is that it simply isn't very good. If you haven't tried them lately, let me again strongly recommend a subscription to the Washington Post, the Wall Street Journal, or the New York Times. You might be as surprised as I was, once I was free to click around the sites without thinking about paywalls, at how much better the reporting is at real newspapers than at free news sites.
Honestly, with a few tweaks (RSS support, titles on pages, getting rid of the 'Breaking News, Latest News and Videos' in every title) I'd pay $12/year for something like this. I'm pretty sure that CNN gets less than that from a statistical viewer of its pages.
Wouldn't it be awesome if pure text, static pages became the new normal, and we looked at video-and-image, JavaScript-laden SPAs in much the same way that we look at parachute pants, bowl haircuts and rhinestone-studded jean jackets?
Give it time. The original CNN site, many years ago, wasn't as slow and terrible as it is today. I'm not convinced history won't repeat itself if this version became popular.
RSS is still available on a lot of websites ... and for the ones that aren't available using RSS, you can use a third party service that will create a feed.
They're still loading an analytics javascript file that is 230KB minified. That's literally two orders of magnitude larger than anything else that gets loaded from that URL.
I messed around with aggregating a few big news sites' RSS feeds and generating a static site from them after being in areas with choked cell reception. Turns out it's still plodding along!
Nice format. Hopefully with some real unbiased content, but I am not expecting that from CNN. Nowadays the news has to be a mean value of (Reuters + NYT + CNN + Fox + ...)/N, but I just do not have enough time to read them all, so I read HN instead.
It's great that they included an easy toggle to the Spanish-language version. If you're trying to learn Spanish and already know English, or vice versa, being able to read topical news every day and switch back and forth between the two languages is a great tool!
The analytics company I used to work at had a private news archive for pretty much every newspaper/magazine you could imagine. All text, all searchable. I've long wondered why no one makes a site like that for everybody.
The link to the full CNN experience at the bottom of each article doesn't deep-link to that article's full version. It just goes to the CNN home page. Big fail.
I'd love to make a little userscript to redirect articles from cnn.com to here, but I don't think there's any way to extract the .com URL from the .io pages.
Thanks; you should mention that in your comments. There's some irony in the fact that you hide your affiliation while promoting a website about independent news.
This is cool but for some reason (maybe reading too much HN!) I clicked and expected a demo or research paper about a Convolutional Neural Net (CNN) that you can feed lots of text and it would figure out the content and semantic relationships of all the texts on its own and it would get better with every new text that you feed it.
Actually I wouldn't be surprised if someone's already done that...
This is awesome. I really don't understand why a news page would load so much stuff and then hang somewhere. Sometimes I wondered whether it was loading the whole world. Maybe we should take some steps back now and keep our sites SIMPLE but COMPLETE, just like Hacker News itself, Lobsters (https://lobste.rs), or Pxlet (http://www.pxlet.com/), so that we are not distracted too much.
Maybe technically Google's AMP is a good help in boosting loading speed, but it only works on mobile devices and it takes a lot of effort to adapt a site to AMP.
Usually a modern website loads lots of third-party libraries, ads from different advertising platforms, social media sharing modules, analytics modules, etc. All of these make lots of HTTP requests and increase the response time. How would they not be SLOW?
My laptop can run computationally-intensive physics simulations and play streaming hi-def video with no problem, but if I click a link to a mainstream news article it just about keels over and dies. How did our grandparents ever manage to load the news on an underpowered machine called "paper"?
Taking out a full-page newspaper ad is an attention-grabber. Taking out a full-page pop-up is merely an annoyance. Why does this perception exist? Why isn't digital space regarded on-par with paper space?
Because of this, digital ads sell for much, much less than paper ads, and so ad-supported companies have to use 10x as many ads to get 1/10 the revenue. And these ads can gather more info if they run in the client.
I use uBlock Origin, uMatrix, DuckDuckGo, and HTTPS Everywhere for a mix of adblocking and privacy. uBlock Origin lets you block specific or wildcard DOM elements - literally anything on the page. Sometimes I even block annoying photos that are not ads.
uMatrix lets you block or allow different file types from different domains. By default, you get all CSS and Image files from any site, and all Javascript from the original domain. The drop-down makes it easy to update the allowed file types, then just refresh the page. (The downside is sometimes each refresh will just load 1 more script, so it takes 15 minutes of adding another domain, refreshing, domain, refresh, until the whole capcha or all images load.)
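To make that concrete, a rough sketch of what such a uMatrix ruleset looks like (the rule format is source destination type action; example-news.com and its CDN are made-up names):

* * * block
* * css allow
* * image allow
* 1st-party * allow
example-news.com cdn.example-news.net script allow

Each extra allow line corresponds to one of those refresh-and-add-another-domain rounds.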
That and the abomination of the technology stack that we use for said process. HTML/CSS/JS is absolutely not the answer for "I want to make a UI", never mind the fact that many UI designers have no clue how the stack works and abuse it horribly.
The idea may be "paper", but what's actually being done is far from it. The browser may paint and repaint a pixel a dozen times before it's ever rendered. DOM-traversal is one of the most important parts of a browser's optimization because of the tens of thousands of times it can happen to paint a screen.
I think it's a recent phenomenon and not related to rendering of ads.
I use Chrome and it's quite common, but not always. I open the developer console and find thousands of errors and warnings when I notice the tab icon keeps blinking. The errors are mostly blocking of third-party plugins, security domain violations, or similar content policy enforcement. And it just keeps reloading the same requests.
I don't have any ad blocking or any extension installed, so this should be a widespread thing. It's probably overzealous ad providers ignoring Chrome's security policies.
If only CNN would go back to being the old but trusted, amazing CNN, with Christiane Amanpour at the landing in Mogadishu during Restore Hope, or Bernie Shaw at the hotel in Baghdad for Desert Storm.
Now, a shell of its old self, CNN is just MSNBC's cousin, with people who say "we, ahem, I mean, the Democrats, are losing Michigan" live on screen.
I think they appeal to the majority sentiment which sways dem or rep depending on who's in office. They were much more center-right while Obama was at the helm.
Ah, the good ol' days, when CNN didn't have CIA minders moderating its releases and directing its reporting policies, because of, y'know .. "the war on the TERROR" .. before The Patriot Act came and ruined America.
Yeah, I remember those days too. While I'm glad that CNN has become more accessible with this text interface, I'm still not going to read it. It is, without a doubt, a highly unreliable news source, and not something one should be basing one's world view on.
We've asked you many times to post civilly and substantively here, and since that change doesn't seem forthcoming, we've banned this account. We're happy to unban accounts if you email us at hn@ycombinator.com and we believe you'll post within the guidelines in the future.
You know Hacker News people live in a world of their own when a post about a stripped-down Cable News Network website gets more upvotes than all the posts on convolutional neural networks combined.