Machine Learning (Theory), http://hunch.net/, is a good place to learn about new machine-learning papers. It doesn't really do research exposition on the blog, but it posts about recent ML conferences, highlighting some of the papers the author finds interesting.
Embedded in Academia, http://blog.regehr.org/, is about 50% personal stuff, but 50% posts on John Regehr's work on C-compiler fuzzing, with some interesting examples if you're into compilers or the finer points of C semantics.
Proper Fixation, http://www.yosefk.com/blog/, is by an embedded developer (not academic), and not always about research, but it has some good researchy and expository posts. For example, it has the best concise overview I've found of how SIMT/SIMD/SMT relate (http://www.yosefk.com/blog/simd-simt-smt-parallelism-in-nvid...).
While it's a mathematics blog, Terence Tao's blog, http://terrytao.wordpress.com/, has a lot of content likely of interest to computer scientists as well. In particular, his blog-exposition versions of papers are often a better introduction to recent research for nonspecialists than anything in the official published literature is.
Tomasz Malisiewicz's computer-vision blog, http://quantombone.blogspot.com/, has intermittent but often quite good posts on object recognition and similar topics.
Of course I can't refrain from mentioning my own quasi-blog, http://www.kmjn.org/notes/, though only about 1/4 of it is on computer science (about 4/5 of my day job is computer science, but online essays end up being mainly an outlet for everything else).
Unfortunately I would recommend against all of these (except /r/compscipapers, only because I've never checked it out).
/r/compsci is, in my experience, mostly lower-level undergrads who really want to show how much they know; there's a lot of misinformation in comments that gets voted up, and a lot of missing nuance in the discussions.
/r/semanticweb is very inactive, with almost no discussion.
The value there, IMO, is more the links than the discussion. It's not so much like HN, where the discussion itself is half (or more) of the value. But for a quick, concise list of new links in those fields, I find all of those subreddits very valuable.
I would also add Matt Might's blog, http://matt.might.net/articles/. We see articles from it on HN from time to time, so you may already be familiar with it.
I must be an idiot. It took me a few Google searches and finally stumbling on it by accident to figure out how to subscribe to the RSS feed for Serious Engineering's dynamic Blogger blog, since the "Subscribe to this page..." option isn't available. In case you too struggle, there's a pop-out menu on the right.
Many RSS readers these days are pretty good at feed discovery if you just give them the blog URL. At least, both Google Reader and Newsblur (http://www.newsblur.com) seem to be able to dig up a feed for anything I've thrown at them.
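If you're curious what that feed-discovery step boils down to: most blogs (Blogger included) advertise their feeds in <link rel="alternate"> tags in the page head, so a reader only has to fetch the page and read those tags. Here's a minimal sketch in Python, assuming the third-party requests and beautifulsoup4 packages are available (the function name and example URL are just illustrative):

    import requests                  # assumption: the requests package is installed
    from bs4 import BeautifulSoup    # assumption: the beautifulsoup4 package is installed
    from urllib.parse import urljoin

    def discover_feeds(blog_url):
        # Fetch the blog's front page and look at its <link rel="alternate"> tags,
        # which is where Blogger (and most engines) advertise their RSS/Atom feeds.
        html = requests.get(blog_url, timeout=10).text
        soup = BeautifulSoup(html, "html.parser")
        feeds = []
        for link in soup.find_all("link"):
            rel = link.get("rel") or []
            href = link.get("href")
            if href and "alternate" in rel and link.get("type") in (
                "application/rss+xml", "application/atom+xml"
            ):
                # hrefs can be relative, so resolve them against the page URL
                feeds.append(urljoin(blog_url, href))
        return feeds

    # e.g. discover_feeds("http://blog.regehr.org/") should return that blog's feed URL(s)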
The "How I did it"[0] section of Kaggle's "No Free Hunch" blog is really great for getting practical insights into solving machine learning problems. All the posts are short and, unless you're an expert in ML, will likely give you leads on a lot of new material to learn. The pragmatic bent is what really makes it such an excellent resource; there's a huge gap between the mathematical foundations of ML and the practical, real-world problem-solving side of it.
A good source of NLP goodness is http://nlpers.blogspot.in/, written by Hal Daume, a CS prof at UMD. He covers recent NLP conferences and also his own work. Some of it is in the area of domain adaptation, which is of interest to anyone trying to bring research papers to real-world products.
Well, just yesterday I thought to myself: I should compile a good list of blogs for reference, instead of the usual marketing blogs disguised as CompSci (most of them are really just trying to sell you something, not discuss real CS). Thanks for sharing.
CodeAvengers is an addictive hot new site that teaches novices the programming language of the web: JavaScript. The site went live last month with 40 interactive lessons and games, and it aims to be the most fun and effective JavaScript tutorial on the web. Script your future now, with CodeAvengers.com.