Hacker News

I'm glad that they're finally coming around to the realization that Wikipedia has become increasingly closed to new contributions, and that they've stopped touting the (patently absurd) hypothesis that new users just don't "get it." (The fact that they'd even think, let alone think first, to blame the users is just a giant head-scratcher).

As a simple UX experiment, I would ask new users this: try to contribute substantively to any article on Wikipedia. Just try it. Make a good-faith, high-quality edit to a page, and see how long the edit is allowed to stand. More likely than not, the contribution will be automatically reverted, within milliseconds, by a bot. If it's not, it'll be hand-reverted by a hardcore Wikipedia editor -- part of the statistically small, but disproportionately powerful cadre of self-appointed content cops, who seem to see their jobs as being bulwarks against change. In its zeal for the trappings of due process -- attributions, anti-"vandalism" policework, source checks, guidelines, and so forth -- this clique has lost sight of the net effect it's had on the site, which is to calcify and close off the free exchange of information that was so crucial to Wikipedia's early growth.

IMO, Wikipedia has faced a fundamental challenge in recent years: namely, that content-quality efforts have threatened new content volume. I don't envy this strategic predicament, being forced -- quite literally -- to choose between quantity and quality. It's not an easy balance to strike, and, given the circumstances, Wikipedia's historic track record is quite admirable. Recently, however, the balance has tipped too far in the direction of quality-policing. And now it's starting to undermine the core tenets of the project. I remain optimistic that Wikipedia (and/or the Wikimedia Foundation) can right the ship. But it'll have to mean a substantial uprooting of some bad seeds that have been allowed to take hold for years now.
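To make the "reverted within milliseconds by a bot" claim concrete: anti-vandalism bots score each incoming edit against a set of features and auto-revert above a threshold. The sketch below is purely illustrative (the function names, features, and threshold are all hypothetical, not ClueBot NG's actual code, which uses a trained classifier over many more features), but it shows why a good-faith edit, say a large deletion by an unregistered user, can trip the same wire as vandalism.

```python
# Toy sketch of a feature-scoring anti-vandalism heuristic.
# Hypothetical: real bots use trained classifiers, not hand-tuned weights.

def vandalism_score(old_text: str, new_text: str, editor_is_anonymous: bool) -> float:
    """Return a score in [0, 1]; higher means 'more likely vandalism'."""
    score = 0.0
    # Large deletions look suspicious, even when they are good-faith trims.
    if len(new_text) < 0.5 * len(old_text):
        score += 0.4
    # Mostly ALL-CAPS additions look suspicious.
    added = new_text[len(old_text):] if new_text.startswith(old_text) else new_text
    letters = [c for c in added if c.isalpha()]
    if letters and sum(c.isupper() for c in letters) / len(letters) > 0.8:
        score += 0.3
    # Unregistered editors get less benefit of the doubt.
    if editor_is_anonymous:
        score += 0.2
    return min(score, 1.0)

REVERT_THRESHOLD = 0.5  # hypothetical cutoff

def should_revert(old_text: str, new_text: str, anonymous: bool) -> bool:
    return vandalism_score(old_text, new_text, anonymous) >= REVERT_THRESHOLD
```

Note that an anonymous editor trimming a bloated section (deleting more than half the text) scores 0.4 + 0.2 = 0.6 and gets reverted, a false positive of exactly the kind described above, while a small registered addition passes untouched.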



I've done it. Without a doubt there is a high barrier to entry, but, as with anything, you have to start small. Find articles that aren't fleshed out already. Find things that are entirely incomplete. Dive into those articles first; those are the ones that need attention, not Barack Obama's page. I started getting involved with Wikipedia about a year ago. If you start small and build a history of trust, older users are more likely to trust and approve your edits.


I agree that you should start small. My contributions have been minor alterations to articles on obscure or esoteric topics. Like fixing a spelling error in the article for "A Pup Named Scooby-Doo." (Although that article is maybe more active than you'd think it would be.)


I'd argue something even worse than what you're positing. The folks who've stuck around and made contributing to WP nigh impossible are the same ones who make strong statements about turning WP into what's essentially just a digital copy of a traditional encyclopedia, artificially constraining it in both breadth and depth, even though storing the material (bits) is almost free, whereas its print cousins were constrained largely by cost and physical size.

WP is free of both of those constraints, so why are there legions of self-appointed bureaucrats trying to impose them?


Because the articles have to be concise to be readable? For example, I just stumbled into the article "Tinnitus" (http://en.wikipedia.org/wiki/Tinnitus), which contains a "list of notable individuals with tinnitus" (given that tinnitus is common, this is analogous to a list of people who like the colour beige). It makes the (medical) article a bit longer, but, crucially, it fills the references section with totally irrelevant links.


What would argue against making it its own article? Instead of deleting these parts, they can just be split out, and storage is only getting more plentiful by the year.


A "good-faith, high-quality edit to a page" has no reason to be reverted automatically by bots or hardcore editors. If the content is encyclopedic and not controversial, it will be "wikified" by contributors who know the syntax better.


See other comments on this thread about fair-use image bots, which have caused considerable anguish on Wikipedia. They run fast and leave warning templates for many people. Fair use is important, and there are many images without a compliant licence, but most people have no idea what counts as fair use.

E.g.: I have some toys. I take a nice photograph of those toys. Can I release that under a completely free licence and upload it to Wikipedia? There's a famous statue in my home town. I take a nice photo of it and release it under a free licence. Can I upload that? What about a building? Etc., etc.


For what it's worth, this is just restating some research that's been ongoing since at least 2008. I think the final nails in that coffin were established in the "Summer of Research" in 2011. There's no doubt that the decline in editorship is due to endogenous, not exogenous factors.

The WMF are not "finally" coming around to this realization; they are trying to convince the rest of the community to make it a higher priority, and to publicize the efforts they're making to the press.


(The fact that they'd even think, let alone think first, to blame the users is just a giant head-scratcher)

Logically, I can see this. However, when you survey the world, you see a whole lot of blame directed at users. (With certain tech/business subcultures as notable exceptions.) I'll note that when our brains evolved, there was no software nor was there UX, and the only suitable targets for blame were people and critters. I'd bet this is a human cognitive bias.


Largely agreed, but if you'll permit me to get cute, I'd suggest that there's always been UX. Whether in city design, the evolution of tool usage, invention, product design, etc. UX is at the very heart of why we, as humans, always seek to improve upon our lot.

The difference with software is that, for the first time in our history, we're able to measure, isolate, quantify, and control the elements of UX better than we've ever been able to. (It helps that software is often experienced in isolation from its environment, so UI can be more closely correlated with UX than it is for other domains). UX was a fuzzy, ethereal, probably subconscious concept that only recently became a serious discipline. But it's always been important. And our brains have evolved to conceptualize it, albeit intangibly until now.


Largely agreed, but if you'll permit me to get cute, I'd suggest that there's always been UX. Whether in city design, the evolution of tool usage, invention, product design, etc.

Permission denied. :) Everything but tool usage is in the realm of cultural evolution. For the simplest tools, almost all of the error is user error. There's simply not much functionality in a stick that isn't mostly dependent on user actuation.

our brains have evolved to conceptualize it, albeit intangibly until now.

That's my point. UX flaws that are independent of user error have been largely intangible until now.


Define "cultural evolution," though. I'm not sure what you mean there. Not trying to be difficult, because I rather enjoy this conversation. Just unclear about the distinction you're drawing.

I'd argue that UX design -- even if it didn't have that exact name -- has been a distinct discipline long before software. Just ask anyone in the food service industries, the retail industry (department stores were basically innovations in UX in retail; so was IKEA), the casino gaming industry, the amusement park industry, and so forth.

Casinos, in particular, are fascinating UX case studies. The person who first thought of modern casino layout, comping free drinks at table games, oxygenating the gambling floor, removing clocks from the walls, comping rooms and other amenities for big spenders and regulars, which games to place adjacent to which others, etc., was a UX designer in spirit if not in title. And those decisions were pretty rigorously tested and quantified. These things may not meet the technical definition of UX as we commonly speak of it on HN, but they certainly hold with the spirit of the discipline Don Norman would later come to articulate as "UX."


Define "cultural evolution," though.

As for a definition, I'm talking about the evolution of individual behaviors as transmitted through culture. If one somehow rendered the entire human race sterile, but we continued to propagate ourselves for the next 2000 years through cloning, you'd still have "cultural evolution."

I'd argue that UX design -- even if it didn't have that exact name -- has been a distinct discipline long before software.

Again, I don't disagree. That you bring this up indicates to me you've missed my point.

food service industries, the retail industry..., the casino gaming industry, the amusement park industry...

All of these postdate most of the evolution of the human brain's structure and capabilities. It's somewhat true that there were "UX errors" before the stone age. I say "somewhat" because it's really hard to delineate these as entities without a certain degree of technology. When all you have are sticks, what is the error of the designer and what is the error of the user? Maybe the user's just "holding it wrong?"

It seems to me that we're likely to assign blame to sentient and animate entities. And even if the stick wasn't "whittled correctly" according to Thag, maybe it's just fine to Ookla? It's just hard to talk about "UX errors" as quantifiable entities until we get standardized production and large sample sizes.


If I understand your original point correctly, it's this: it is human nature to attribute most things to user error, ergo, my assertion that Wikipedia's blaming its users was a "head-scratcher" was off the mark. Our evolved inclination, which predates the discipline of even considering UX as a tangible -- and, more important, a controllable -- concept, is first to start with the hypothesis that the user is in error. (And, furthermore, that such a hypothesis is not necessarily unjustified by historical frequency).

I actually agree with you here, but I think this point and mine are not so much at odds, as they are orthogonal. My point is that, human nature or not, Wikipedia came about in the modern era. Even if our cognitive bias/inclination is toward blaming the user, we have tools and analytic frameworks at our disposal which exist precisely to allow a necessary check against our brains' heuristics. Those checks should have been run by the Wikimedia elite. While I'll admit that "head-scratcher" is an unfair description, rendered mostly for rhetorical effect, I believe my point still stands. We have modern tools at our disposal, precisely because we are now -- uniquely, in our history -- aware of our brains' strengths and weaknesses in pattern recognition and situational assessment.

If this is not an accurate summation of your position, then I'll freely admit that I'm missing your point.


No, that's it. Strangely enough, we were basically agreeing the whole time.


Make a good-faith, high-quality edit to a page, and see how long the edit is allowed to stand.

I guess I haven't found this, but I admittedly don't try to edit [[George W. Bush]] or articles like that. The common case for me is crickets: I create an article on, say, a 19th-century German author who had a de.wikipedia article but no en.wikipedia one yet. Short article, maybe 2 paragraphs, one reference. The usual outcome is that I hear from nobody about the article ever again. No complaints, no praise, no reverts, no improvements, no anything at all. Exceptions: 1) it might get tagged as an "orphan" if nothing links in; and 2) some people may fiddle with the window dressing, adding/modifying categories and infoboxes and whatnot. Once in a long while someone will come in and expand it greatly, and that's usually positive.

My recent hobby has been articles about archaeological sites, and that has the same crickets-type feeling. So I'm wondering what new users are editing that they get any comment at all, much less angry ones! In my experience picking up a book and creating short new articles that reference the book is more than enough following-the-policies to avoid complaints about your articles, assuming it's some kind of legit reference book.

If anything, there are a lot of edits that should be challenged that aren't, with the bar for getting spam into Wikipedia not really that high. I know of at least one university that actually has a paid staff member creating total PR-puff-piece articles about that university's professors, and few of them get challenged, despite the fact that they read pretty much like a PR person wrote them, and are "cited" mainly to the subject's own articles and university press releases. Some academics edit Wikipedia solely to insert references to their own papers in articles. There are also rumors that publishing companies have people inserting references to the publisher's recent books, as a form of advertising rather than because they in good faith think it'll improve the article. Heck, check out the External Links section on popular vacation destinations; around 20-30% of them are full of travel spam, which isn't even that subtle, and yet manages to get in. Maybe all those false-negatives (not reverted) are worth not having more false positives, but it's a hard problem overall in both directions imo.

Really general articles, like [[global warming]], do have more fundamental problems, but I think some of them are just unavoidable. With anything controversial, there will be 50 different opinions about how the article should go, and the only way to reach a compromise is to discuss it and try to figure out how to best organize the material, split some material out to subsidiary articles, word controversial points delicately, etc. If someone who didn't read that discussion comes in and makes an edit, it's likely to be problematic in one way or another. I don't think that's even Wikipedia-specific; that's how things work with controversial subjects when people are writing academic review papers, subject-matter-specific encyclopedias, Linux kernel contributions, etc. Wikipedia's article on global warming is at least probably less bureaucratic to contribute to than the IPCC report is. ;-)

[edit: The above doesn't mean I don't think that there are problems with Wikipedia, which I do, but I think they're not quite as "it's a completely horrible community" as is often suggested. Some of the things, like reaching consensus on a controversial subject, are just inherently hard; other things that are broken for no good reason should indeed be fixed, others depend strongly on what kinds of articles you create, etc. Overall Wikipedia's newbie-friendliness actually seems pretty good relative to other collaborative projects I've been involved in, though.]


Really general articles, like [[global warming]], do have more fundamental problems

As a wikipedia user (edited only typos & punctuation), I don't see these problems. I think http://wikipedia.org/wiki/Global_warming is very informative and reasonable, more so than any other page I can find immediately with google. The talk page is also very good.

Can you be more specific about the problems you think it has? I have trouble seeing wikipedia as anything other than a mind-bogglingly amazing 'shit i woke up in the future' thing :-)


When it first started you could have written an article on cars or some other easy topic that a lot of people know about. Now the topics that are left are insane. (While there is still some easy stuff left, there isn't nearly as much.)

Because of the nicheness of the remaining topics, it will get increasingly difficult to write on them.


Not really; the niche topics are the easiest to work on for the most part, because you're unlikely to run into anyone else.

The hard topics are:

- traditional conflict areas (Religion, Politics, Race)

- broad-level topics (I tried to work on Computer once, bad mistake...)

- Current events

There is an awful lot of low-hanging fruit ready to be plucked, but people aren't really interested in it.



