Hacker News | pmr_'s comments

72k can keep you alive in sub-standard living conditions, but it will not keep a team of 6-7 talented programmers and artists alive for very long. Not even at way-below-market pay, working out of a garage.

And a single person just cannot turn out something close to a AAA title.


Definitely not. I'm getting at the issue of employees versus founders, and I'm responding to the commenters more than to the OP. A lot of people are implying that the 72k would be spent on wages. If that was the expectation, then that's part of the failure.

I have friends who've launched XBLA games on much, much less, working full time out of a basement. During development, they could barely pay rent. Afterwards, they bought new houses.

I feel like people don't take real risks any more, and seem to misunderstand what actual ownership of the outcome looks like.

Sign of the times?

And the issue of AAA ... well ... Get a simple game or playable demo going and build off of that rather than shooting for the stars. Or did we lose sight of what an MVP is, too? Sounds like the OP did.


Don't forget that they get a good chunk less than 72k. You gotta factor in KS' cut and the cost of obtaining and shipping the physical rewards.

I feel like a lot of people underestimate the work/cost involved in physical rewards. I remember reading a game dev's blog post about the subject.


Definitely not enough cash.

Being overwhelmed by rewards seems to be a common failure on KS. The list he rattles off in this case is quite extensive, and the postage cost is all that's preventing it from being sent. I wouldn't be surprised if they have 10k into rewards alone, especially when you account for the time spent developing them.


How do you know that? Have you had any practical experience with a large code base using new-style (e.g. post-C++11) Concepts? People have been using Concepts in some form for a very long time, and even properly documented they can still be a major hurdle to less advanced or less interested users.


The Wikipedia article linked in another reply to parent confirms that. The months in which the worms were harvested were named after the worms.


Starting to play sound without user confirmation means an instant tab-close for me. I won't even take the time to look at it. Bonus points for not being Flash, which hijacks browser shortcuts.


The sound is not particularly intrusive or loud, and you clicked on a link that said "Listen" in the title.

Generally I dislike websites that randomly start playing sound (and follow the same tab-close behaviour), but this was not ill-advertised or unexpected.


Like others, I had a totally different expectation given the title. I thought it would be a piece about how Wikipedia had some message or lesson that we should listen to. I think the author had no bad intentions and what he built is probably worth looking at. I was just pointing out how a mistake in web design made me completely ignore this website, because I felt it violated my control over my computer.


In that case I wonder if we can't solve the problem in the community, e.g. by requiring a [sound] tag in the title or something. I enjoyed this link and there is a risk of people downvoting it for the sound alone that might cause interesting links like it to be removed in the future.


I'd prefer [autoplay], but I support the idea!


I sometimes post links to BBC Radio programmes. I tag those with [audio], but they are not autoplaying.

HN: should I continue to tag those?

I haven't ever submitted anything with autoplaying audio, but I imagine I'd use [autoplay] or similar.


> In that case I wonder if we can't solve the problem in the community, e.g. by requiring a [sound] tag in the title or something. I enjoyed this link and there is a risk of people downvoting it for the sound alone that might cause interesting links like it to be removed in the future.

Solving it in the community can solve the problem of people unwittingly stumbling upon it from here, but it doesn't solve the bigger problem, which is that autoplay is rude. (Surely we've all had the experience of not being able to find the tab from the huge group we just opened that's playing the sound.) It's good if sites see that they get fewer visitors with autoplay than without.


It really should be a browser-level setting where you whitelist the sites that are allowed to do it. I don't mind when YouTube does it, but most sites annoy me.


You can't know how intrusive or loud the sound is for the listener. It has an area effect that normal website media doesn't.


While I agree with the sentiment, in this case all I can say is: You're missing out. I enjoyed the relaxing, ambient-ish sounds combined with tempting links to Wikipedia articles.

I'm not sure there is a good solution to auto-playing. People are fine with it on well-known sites like YouTube, Vimeo, Soundcloud, etc. And most people, when they unwittingly click on a YouTube link, blame the person who made the link, not the site itself. Bizarrely, the target of ire shifts to the site if it's not popular enough.


I just went clicking randomly on Vimeo links and I didn't find a single auto-playing video.

You are right about YouTube, and I actually think they should handle visitors that come from outside YouTube differently than visitors from inside the site. In the first case auto-play is annoying; in the second it is what is wanted.


How do you know that most people are fine with automatic playback on popular sites like YouTube, Vimeo and Soundcloud?

(Genuine question, as you stated it as a matter of fact. Not trying to be obtuse or annoying.)


It would be good to have a "click to unmute" option as the default for how a browser handles audio. Click-to-play works well for Flash plugins; it might be good for audio as well.


I wish for an extension that disables time. Basically turning the browser into an HTML/CSS-based image renderer. No more animated ads or autoplay audio/video. Heaven.



Doesn't avoid animated gifs ;)


I'm sure you're joking, but here's a blast from the past (I'd forgotten about it, anyway): you can set `image.animation_mode` to `none` in Firefox for that.


I wasn't even joking. I'd really love to try purely static webpages (besides the old-guard <a> and <form> elements), and I am delighted there's such an option. This is amazing, thanks.


I'm glad it was helpful! In case you're interested in the other hidden switches that you can toggle if you're willing to void your warranty, there's a reference at http://kb.mozillazine.org/Firefox_:_FAQs_:_About:config_Entr... . (I'm not sure how up to date it is, though.)


I never cared to search for the whole documented list of flags; usually I toy with one or two. Thanks again.


+1.

When I come to HN, I usually read all topics on the front page, then open interesting ones in tabs. After opening 10 tabs, I hear music, then I find the culprit, close the tab, do not look back.

Never play music without user permission.


[flagged]


Why should I adapt my usage patterns to accommodate the shitty behavior of a few websites? Also, are you sure you are in a position to tell other people how to use the web? I certainly don't think I am.


If you are randomly clicking on links on a website like this, then you have no justification for describing a website as shitty for doing the one thing it is designed to do.

There's a website called rainymood.com that does exactly one thing; asking for user confirmation first would be quite silly.


We have justification if we're not warned beforehand.


You clicked a link that said Listen to Wikipedia! I know I expected audio when I clicked it.


"Listen" could imply that we need to listen to Wikipedia's message. I was expecting an essay, honestly.


+1 I'm not adapting either.


I think the statement following this is much more important and interesting to HN:

> The tendency has become more and more prevalent under electronic auction systems.

While incidents like this are often used as examples of the failures of technocracy, I think the problem is elsewhere. A decision process has been outsourced to a machine that is not yet smart enough to make a good decision. The decision might be good in a very localized and easy-to-measure sense (cheap), but it lacks an understanding of other economic signals.


When I read this article, I felt a sharp pain, as it sounded as if it could be directed at our company [1]. Acquiring pricing for local governments through online reverse auctions is what we do.

Your observations are on point. There is strong incentive to treat everything as a commodity these days. Part of our business is helping buyers structure their RFQ documents so that the product or service can be competitively sourced. In the government sector, however, we rarely participate in this process.

Governments must purchase based on their state and local laws, which often ties their hands. A local school system may want to purchase better quality chalk for a reasonable premium, but the laws pertaining to procurement frequently prevent them. It's a really bizarre circumstance where public perception of government waste has resulted in laws that eliminate buyers' ability to cut deals that are better for everyone. I'm not saying it's impossible; it's just much easier for a procurement officer at a local government to follow the standard process (which prioritizes price) than it is to fight for quality.

There are pockets of innovation, however. Our eRA platform stands out because we offer the ability to incorporate non-price factors into our ranking algorithm. This gives a buyer the ability to "weight" one bidder over the others. We developed some of our most advanced features jointly with local governments in Arizona. I'm not sure I should name names, but there are people doing really great work out in Arizona.

We have even considered founding a separate not-for-profit organization whose entire purpose would be to assist local governments in improving their procurement laws so that there would be a better balance of good sourcing practices and the ability to incentivize quality over price where appropriate. That's a huge challenge though. You frequently end up directly opposed to special interests with very deep pockets and a financial incentive to keep procurement laws just as they are.

1: If you're wondering: http://www.eauctionservices.com


If you instruct a machine to optimize for the cheapest price, it will do that. The failure is in the instructions given, not in the introduction of machines.


> If you instruct a machine to optimize for the cheapest price; it will do that

if (x < y) { /* ... */ } is a simple optimisation.

The classic "you optimise for what is or can be measured rather than what is important."

There are ways around this, of course, by setting a minimum quality via an objective third party (MIL-STDs, for example), but that process introduces a whole new can of worms.


That much should be clear; my (unfortunately) implicit question is: how can we instruct a machine to optimize for the very intangible things everybody seems to love so much about this chalk? What about the more subtle economic things (like signals of economic climate and the value of quality)?


I once worked for a guy who claimed he could sole-source a peanut if he had to. Many of his justifications boiled down to colluding with the supplier to find a set of properties that were unique to their particular peanuts: things like soil pH, average temperature of the area they were grown in, not too much rainfall, not too little rainfall. Most purchasers were neither educated sufficiently to question these kinds of specifications, nor were they particularly interested; they also wanted to find the path of least resistance through the process. We probably could have bought most anything we wanted. Later we had a DOE grant get audited, and there was some nervousness on our part about some of these shenanigans that might be found. The auditor just chuckled and said nearly everyone has to play this game to some extent. His major complaint about our books was that our overhead costs were above average, and he helped us make a case to the Uni for lowering them. Didn't see that one coming.

For an item like chalk, though, my employer now has an arrangement with one of the big office-supply companies who specialize in tolerating (and charging extra for) gov't nonsense. Lately it's been CDW-G, I think. They provide a walled garden of approved products (which don't include the kind of good quality chalk in question) and a semi-streamlined purchasing process. It works great for the purchasing dep't, I'm sure, but it sucks for end-users because it introduces the same kind of uncertainty that a purchasing organization is supposed to fix. For example, the prices quoted on the website are not the actual prices of the products, and the products have variable discounts; this, plus the latency inherent in dealing with the purchasing process, makes budgeting and price-shopping difficult to impossible. In my very small dep't we usually have budget projections rounded to the nearest hundred even for small items like chalk/dry-erase markers, because doing otherwise is a waste of time. We spent a lot of time a few years ago working around the purchasing system just trying to get decent chalk before fatigue eventually set in and we just gave up and installed whiteboards.


Optimise by a score, where the score is calculated as a function of price and some quality metric. Unfortunately this can get complex.
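
A minimal sketch of such a composite score, with an invented Bid type and invented weights (nothing here comes from any real procurement system):

    import java.util.*;

    // Hypothetical composite score: lower is better.
    record Bid(String supplier, double price, double quality) {}

    static double score(Bid b, double wPrice, double wQuality) {
        // The weights are policy knobs -- choosing them is where it gets complex.
        return wPrice * b.price() - wQuality * b.quality();
    }

    static Bid pickWinner(List<Bid> bids, double wPrice, double wQuality) {
        return bids.stream()
                .min(Comparator.comparingDouble(b -> score(b, wPrice, wQuality)))
                .orElseThrow();
    }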


Regardless of what metric you choose, Arrow's Paradox will bite you. Single-objective optimization will always force you to ignore good alternatives.


I do not understand the reference to Arrow's Paradox in a discussion of procurement policy (almost certainly related to my lack of knowledge). Can you elaborate?


Sure --- there's a neat paper by Franssen (2006) that demonstrates the formal equivalence between optimization problems and the social policy problems Arrow was concerned with. Basically, Arrow says that some constituents will always lose out under any social policy. Franssen showed that you can swap out "composite cost metric" for social policy and "components of the cost metric" for constituents, and the same arguments apply.


It must be too early for me. I cannot find the Franssen citation. Title?


Here's the full citation. Turns out I misremembered the year (2005).

Franssen M (2005) Arrow’s theorem, multi-criteria decision problems and multi-attribute preferences in engineering design. Research in Engineering Design 16(1):42–56


Right, a specification for chalk is just "chalk, uncolored and/or colored." There is no way to specify quality.


Sure there is. Sadly you'll need humans for that part, though. You could get several humans who each get a sample of each supplier's chalk, and then ask them to rate the quality of each on one (or several) scales. The computer can then integrate this rating with the rest of its metrics.


I didn't mean to say that it wasn't possible, and you probably don't need a subjective measure of quality. I think you could empirically discover some objective characteristics of good chalk that would do the trick. You could at least quickly reject the worst chalk, which is too brittle and too hard. I meant to say that there was, and is, no sufficiently universally recognized way to specify chalk with the desirable properties to a gov't purchasing agent such that s/he can buy the right stuff.


Specify bending strength: the chalk must not snap under a specified force. Next, define a writing pressure, angle, and stroke speed, and require fewer than x blank areas in the line. Require that the line width be within a specific tolerance: not too thick, not too thin.

Maybe a spec like this doesn't exist now, but you could use the good chalk and write the spec based on how it did.
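
As a toy illustration of encoding such a spec, with every threshold invented:

    // Invented acceptance test for chalk samples; all numbers are placeholders.
    record ChalkSample(double snapForceNewtons, int blankSpotsPerLine, double lineWidthMm) {}

    static boolean meetsSpec(ChalkSample s) {
        return s.snapForceNewtons() >= 15.0                       // survives the bending test
            && s.blankSpotsPerLine() < 3                          // fewer than x blank areas
            && s.lineWidthMm() >= 1.5 && s.lineWidthMm() <= 3.5;  // width within tolerance
    }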


The trouble is that the places that sell chalk to universities don't rate chalk like this. Like I said in another post ITT, we derped around for a couple years before we just gave up and installed the damn whiteboards.


I'm totally unaware of conventions used in public purchases.

Isn't an "electronic auction system" just a database anyone can spam, with the actual decisions probably made by humans? My hunch (which might be incorrect) is that the auction system creates a barrier of disinterest where the official has very little else to go on besides the prices quoted. In this case it's not that the machine is stupid; it's way worse: the system actively stops humans from doing what humans are particularly good at - separating wheat from chaff based on experience and intuition - when the official will never use the chalk and has no idea what the impact of the product will be.


The problem is that there is no known alternative, really. If you just let the government official decide, it breeds corruption.

Or at least it used to. Perhaps with modern IT systems we could create a solution that delivers transparency to the process - the official would explain his/her choices publicly, and the public would have an opportunity to argue...


Sometimes you're required by law to go with the lowest valid bid (presumably to discourage kickbacks and self-dealing).


Not really. It's to prevent silly mistakes.


At the risk of sounding argumentative: yes, really!

Many (most? all?) state and local governments have laws that generally require you to go with the lowest valid bid to an RFP.


I bet on the other side there are teachers complaining about the quality of the chalk they end up getting. This is nothing but a case of the employee responsible for inputting the auction parameters being a lazy bastard and not talking to the people who use the product he is buying in great quantities.

It's all a matter of doing your job right, even when a machine helps you with some portion of it.


You are not quoting the part where he shows his benchmark results, analysis of the problem, and suggested solution. I don't necessarily agree with Linus' tone or choice of words, but what you are doing is far worse. I'm not even sure if you are trying to be sarcastic, but that does not even matter: You are not contributing anything either way.


I made an observation based on Linus' comments, which is sad especially for the people who worked on kdbus all this time. If you think your comment helped anything, maybe you should consider making such comments when you see ones like the next one:

quote: """ amelius 3 hours ago

Should be rewritten in Erlang or Go, if you'd ask me. """

otherwise you are full of sh*t. :)


Even if your tax is paid automatically you can still recover taxes, but it is hard and the laws are confusing. In Germany, a special profession with a state exam (Steuerberater) as well as special associations (Lohnsteuerhilfeverein) exist or have been created to help individuals. For the self-employed, or for people interested in recovering part of their taxes, using those services is almost unavoidable.


At first I thought "a build system which automatically figures out what to build and how to build it purely based on the source code" included figuring out library dependencies. It looks like I was wrong, though, and you still have to set linker and compiler flags yourself. I think the hard part is actually understanding your dependencies (minimum versions, different build configurations and all) and handling them in a unified manner without too much special-casing for their peculiarities. CMake already does this quite well, but there is still lots of room for improvement.


It would be conceptually easy to extend Ekam to support automatically deciding which libraries to link against, and this has always been something I intended to do eventually, but it just hasn't happened yet.

In Ekam's link stage, it starts with a root object (typically, something exporting the symbol `main`) and then tries to satisfy all missing symbols using other built objects, transitively, until none of the remaining missing symbols are satisfied by any known objects. It then hopes those remaining symbols will be covered by libraries specified by the `LIBS` environment variable and so invokes the linker.
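
Roughly, it's a worklist fixed point. A sketch in Java-ish pseudocode (hypothetical names, not Ekam's actual code):

    import java.util.*;

    // Hypothetical sketch; Obj stands in for a built object file.
    record Obj(Set<String> defined, Set<String> undefined) {}

    static Set<Obj> closeOverSymbols(Obj root, Map<String, Obj> symbolIndex) {
        Set<Obj> linkSet = new LinkedHashSet<>(List.of(root));
        Deque<String> missing = new ArrayDeque<>(root.undefined());
        while (!missing.isEmpty()) {
            Obj provider = symbolIndex.get(missing.pop());
            if (provider != null && linkSet.add(provider)) {
                missing.addAll(provider.undefined());  // a newly added object may need more symbols
            }
        }
        return linkSet;  // symbols with no known provider are left to the libraries in $LIBS
    }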

Ekam could additionally at this point consult an index of symbols exported by libraries in order to decide which ones to link against. We just need to implement such an index and decide how exactly a library gets included in the index.

I haven't implemented this because with Sandstorm we prefer instead to bring most of the source code for our dependencies into our build tree, so that they are in fact all built with Ekam instead of linking with pre-built system libs. This makes sense for Sandstorm since we intend to run in a chroot environment anyway and so cannot really share libraries with the rest of the system regardless. It's also really nice for development to be able to edit dependencies directly when needed and start using those changes in Sandstorm code without having to install the new library version somewhere as an interim step.


It's something that someday will come with clang modules support: http://clang.llvm.org/docs/Modules.html#includes-as-imports

With proper caching it could make local build systems obsolete for many purposes. Maybe it would then be easier to delegate module building in a style similar to distcc, but less fragile.


I'm not intending to put down you, your enthusiasm, or the people behind Modules: I've been hearing that story since C++11 was still called C++0x (this might be an exaggeration), but I don't see anything happening in terms of standardization. And as long as modules are not standardized, they are not going to see widespread adoption in C++, and certainly not in the open source community.

Because I like to walk down memory lane, I've looked at my HN comments regarding modules. Here is one from exactly one year ago [1], one from 855 days ago [2], and one from 1100 days ago [3].

[1]: https://news.ycombinator.com/item?id=7491149
[2]: https://news.ycombinator.com/item?id=4836499
[3]: https://news.ycombinator.com/item?id=3613636


    if( masterList[z].list2 != NULL && masterList[z].list2.length() > 0 )
    {
        for( Integer y = 0; y < masterList[z].list2.length(); y++ )

The if-statement is a good summary of what is wrong with Java. The author doesn't even notice that the second argument of && is redundant and keeps it in the "refactored" version as well...


> The author doesn't even notice that the second argument of && is redundant and keeps it in the "refactored" version as well..

That's false. A list object may be initialized but be empty (making its length zero). Or it may not be initialized (making it null).

Both conditions may happen independently of one another. He checks for the null first so that checking the length does not throw a NPE.

> The if-statement is a good summary of what is wrong with Java.

Frankly, I see nothing wrong here.


He then proceeds to iterate over the list using zero-based indexing. The loop would not execute even once if the list had length zero, making this check just plain wrong.

I know even lower-level languages (C++, C) that don't have the problem of things which make no sense to be NULL (list elements). The problem is that Java did away completely with value types and made everything pointer-only. That has been recognized and fixed by later languages (C#).

The code is full of problems: the first is the language, the second is the programmer, the third is the missing for-each construct.


My word, something can be redundant (not conceding that point) and not "just plain wrong."

Your language hate and insistence that the programmer himself is one of the "problems" is why people don't do code reviews, and why people get overly defensive if you offer constructive criticism of code. Your criticism is not constructive.

If you are in a position of power or mentorship I suggest you take a moment to think how your words and actions influence those around you, particularly those less experienced who may look up to you.


> My word, something can be redundant (not conceding that point) and not "just plain wrong."

I think we can agree to disagree. I'm strictly against redundancy if it doesn't serve a well-defined purpose.

> Your language hate and insistence that the programmer himself is one of the "problems" is why people don't do code reviews, and why people get overly defensive if you offer constructive criticism of code. Your criticism is not constructive.

I would phrase my criticism entirely differently if the recipient were someone who had asked for my commentary, and not someone who felt confident enough to write a blog post on how to start refactoring code.


> I'm strictly against redundancy if it doesn't serve a well-defined purpose.

Well, a good optimizing compiler might factor redundant checks out. The JVM has one of the best optimizing compilers around... Sometimes source code clarity is better than "absolute correctness", especially when we're discussing something trivial.


> I would phrase my criticism entirely different if the recipient was someone who asked for my commentary and not someone who felt confident enough to write a blogpost on how to start refactoring code.

Right, because we should seek out excuses to be nasty to others. How about we just try to be constructive as much as possible?


> I know an even lower level language (C++, C) that doesn't have the problem of things which make no sense to be NULL (list elements)

You must null check a lot of things in C, especially since there is no graceful error handling (try/catch blocks)... C certainly allows things to be null (or garbage) values.

> The problem is that Java did away completely with value types and made everything pointer only.

I'm not sure what you are saying here -- the very notion of pointers does not exist in Java. This decision was made while creating the language, and it avoids an entire class of programming errors. Java is strictly pass-by-value.

> the third is the missing for-each construct.

I completely agree with you here. Java does have a for-each construct, and it's recommended to use whenever possible. It avoids an entire class of programming errors.
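
For the snippet quoted upthread, the for-each shape would be roughly this (Item and process are made-up names, and I'm assuming list2 is iterable):

    // Hypothetical rewrite of the quoted loop using for-each.
    if (masterList[z].list2 != null) {
        for (Item item : masterList[z].list2) {
            process(item);  // no index bookkeeping, no off-by-one risk
        }
    }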


> I'm not sure what you are saying here -- the very notion of pointers do not exist in Java. This decision was made while creating the language, and avoids an entire class of programming errors. Java is strictly pass by value.

To clarify: the GP is probably referring to boxed and unboxed data types. IIRC, Java has some unboxed data types ("primitive" types?), but mostly everything is boxed behind a pointer.


> but mostly everything is boxed behind a pointer.

Behind a Reference would be more accurate. It's just a reference to a spot in the heap. Other than similarly "pointing" to a place in memory, the comparison between Java References and C Pointers stops there. One cannot pass a "pointer" in Java, nor can the pointer be free-form manipulated like in pointer-arithmetic.


My statement is perfectly accurate in the context, which is discussing the representation of Java types.

Java doesn't hold a monopoly on the word "pointer." For example, Go has pointers but doesn't allow pointer arithmetic in safe code. Similarly for Rust.


> You must null check a lot of things in C, especially since there is no graceful error handling (try/catch blocks)... C certainly allows things to be null (or garbage) values.

You are correct that adding C as an example language was wrong. C++ on the other hand still stands.

> I'm not sure what you are saying here -- the very notion of pointers do not exist in Java

Java references are just C pointers without pointer arithmetic.


The problem you highlighted is just redundant case analysis. That's not a Java specific problem.


No, you don't need the second argument, since the for loop just won't execute at all if the list length is zero.


You can't always rely on for-loop mechanics to do branching.

Also, having an implied "conditional" that must be extracted by reading the head of a loop is very, very unreadable. I'd reject your submission in a code review.

Instead, you should do something like below. I assume that your example is contrived, so for the sake of argument, assume I have all sorts of business cruft around mine. I.e. should_process exists because there are business rules for processing/not processing the master_list.

    def should_process(master_list):
        return master_list is not None and len(master_list) > 0

    def main_doing_of_stuff_foo(master_list):
        if not should_process(master_list):
            return  # Yes, this thing should be in its own method so you can do an early exit.

        for item in master_list:  # Yes, you should be using an iterator as well, not indexing.
            print("Body goes here")


I can't follow. The code I showed is already processing masterList and decides whether it should process masterList[z].list2 for every z in 0..masterList.length.

I fully agree that the code in my post is very bad; I took it from the article.


The main point is that you shouldn't rely on an implied rule that comes from the oddity of a zero-length list making the loop execute zero times.

If I'm looping through a list, it's because I want to loop through it and process its items. I'm going to return/skip early for whatever reason, and not rely on the "logic" for processing being built into the length of the list, or into whoever populates the list.


> The author doesn't even notice that the second argument of && is redundant

Doesn't it check if a list isn't null and then if the list has at least one item in it? Or are you simply saying the for loop takes care of the situation where there are 0 items in the list?


I actually don't think he should remove the second argument until the entire thing has been verified in tests. What if it was actually:

  masterList[a].list2.length()
..but he misread it as

  masterList[z].list2.length()
and so removed it..?


You are saying that you actually need to read and understand code you are refactoring. Hard to disagree.


This has been discussed here before, in 2012, and the linked version is no different from the one before.

https://news.ycombinator.com/item?id=3871463


It was discussed before, it will be discussed again. We've presumably got new users and new insights in the last 3 years.

