Hacker News | veebat's comments

I'm willing to make a speculative bet on Brave/BAT. The existing user growth is good, and the incentives for their ad system are basically correct for everyone.

The product Brave is actually peddling is akin to anti-cheat software: it takes steps to prevent fraud throughout digital advertising and improve the cost effectiveness of digital ad campaigns. That pitch means they have the ear of advertisers already, and it's only a question of executing on the rest. Given prodding, advertisers will cooperate in order to fix up the market, because what they really want is a "commoditize your complements" scenario. They are not there to become adtech wizards. They want to find qualified customers for their primary business. If Brave succeeds in forming a coalition, you'll start to see businesses from other sectors talk it up and give users and content creators hard incentives to sign on. They can even use such marketing efforts as part of their own speculative plays on the currency.

On the user end of things, that means Brave engages in the click-fraud arms race, but with higher-powered weaponry than can be accessed from within JavaScript. Users get a deal that is a definite improvement on the status quo: they choose exactly when and where they want to see ads. For the other parts of the system, the cryptocurrency marketplace replaces most of the middlemen and their associated incentives for fraud. If you're in the business of total control either way (whether it be "never-again-an-ad" or "I want to access as many users as possible"), it's a devil's bargain, but if you're interested in getting the Web on a more sustainable path where quality is encouraged, this is something that could do it.


I recommend checking out the material about Subtext[1] for elaborations on tabular programming. Often in practical systems an enumeration of finite state is a tool to "squash" very large decision tables into a more comprehensible pattern. But having the decision table aids in discovery of just how many states you have to deal with.

[1] http://www.subtext-lang.org/
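To make the "squashing" idea concrete, here's a minimal Python sketch (the domain, field names, and values are invented for illustration): a decision table written as plain data enumerates every combination of conditions, which is exactly what nested if/else chains tend to hide.

```python
# A decision table as plain data: each row maps a tuple of boolean
# conditions (is_member, has_coupon) to an outcome (a discount rate).
DISCOUNT_TABLE = {
    (True,  True):  0.20,
    (True,  False): 0.10,
    (False, True):  0.05,
    (False, False): 0.00,
}

def discount(is_member: bool, has_coupon: bool) -> float:
    # Because every row is written out, missing states are visible
    # at a glance instead of falling through a forgotten else branch.
    return DISCOUNT_TABLE[(is_member, has_coupon)]

print(discount(True, False))  # 0.1
```

With two booleans the table has four rows; with five conditions it has thirty-two, which is the point where an enumeration of named states starts to earn its keep.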


Yup! There are many versions of Subtext. The most relevant to decision tables is perhaps schematic tables: https://vimeo.com/140738254

"a cross between decision tables and data flow graphs"


This is kind of analogous to photorealism versus cartooning. A history or biography attempts to adapt events and experiences directly to the page: "this happened, and then this happened, and it felt like that". Fictional work transforms those same elements into exaggerated characters and scenarios. Both kinds of work require some creativity to build up a coherent story, but fiction has a freer hand to manipulate elements and choose its focus.

Good fiction usually isn't built from pure invention with nothing real to adapt from, though. That path leads directly to cliche, and it's easy to find cliched fiction.


Reading aloud is something I've returned to in later years as speaking practice, vocal warmup, and testing my own writing. It was, of course, a chore in school.


There's a profound wisdom to knowing the niche you're in, and Zig does seem to have that. The applications part of the stack is saturated with options, all trying to encroach on each other, but when it comes to the genuinely low-level, unsexy stuff...the viable options are all pretty old languages, and languages that patch over those old languages.

And with respect to the discussion of boilerplate, you can, of course, always have a "Zig++" language that patches over Zig. Because Zig leaves out heavier metaprogramming functionality, it's much more straightforward to generate useful Zig source. There are a lot of advantages to restraint in the design, and Go shares that, too.
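As a toy illustration of why restraint helps here (this is a hypothetical generator, not any real tool): because Zig source is regular and has little syntax to get wrong, a "Zig++" layer could be little more than a text emitter. The struct name and fields below are invented.

```python
# Emit a Zig struct declaration from a Python description.
# A language with heavy macros is hard to target this way;
# plain, predictable syntax makes codegen almost trivial.
def emit_struct(name, fields):
    lines = [f"pub const {name} = struct {{"]
    for field_name, field_type in fields:
        lines.append(f"    {field_name}: {field_type},")
    lines.append("};")
    return "\n".join(lines)

print(emit_struct("Point", [("x", "f32"), ("y", "f32")]))
```

Running it prints a complete, compilable-looking Zig declaration; the generator never needs to understand Zig semantics, only its surface grammar.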


I read all that and still think, "you don't know what you're talking about".

This is 100% a problem of dependency management, i.e. a problem common to software everywhere, at all times, that has never gone away. Whether more work survives ultimately comes down to whether it's easy to make it survive, not to a "blame them for lacking effort" angle. Some folks will take the extra time, some won't.

The software that still has its dependencies, we can do something about. For most games that equates to "emulate the release platform, and we get the playable release back" - which is the de facto standard, since it's relatively easy to get copies off a ROM or CD, and the release is the single artifact most representative of creative intent. It absolutely makes sense that almost all the effort would go toward that. We like seeing Leonardo da Vinci's plans and studies, but he was focused on results, and we mostly know him for those results.

The source is more specialized, and not as easy to subject to the emulator model. A lot of shops "back in the day" did not even have good backup or version control practices; the game's build system could be a whole custom software project of its own, and only one person knew how to make it go. Getting it even to the point where all the code and assets were accounted for would incur a person-weeks expense, and studios then as now would finish a release with layoffs to shave off salary expense - which in turn gave an incentive to jump ship before finishing to dodge the layoff, leaving certain knowledge about the project in limbo. And employees are often not trusted enough by management to bring in an external drive and dump the project: the optimism of "preserve the art for the future" runs right up against the pessimism of "I'm building a business, not art", and the psychopathy of "if they can't dump the project, they can't steal it". (Yes, it happens - on both ends.) There are numerous barriers in those crucial weeks around the ship date that can cut off this aspect of archival right at the start, and many a preserved game is the result of some hero who went against the direct wishes of their bosses and made copies in secret.

Nowadays, project tooling tends to fall along more standardized lines, and if you start with the goal of minimizing dependencies, you can feasibly have a smooth path to restoring the build without incurring the same overheads that make managers give up on archival to save a dime. So while commercial conflicts of interest remain, it's gotten relatively easier, provided your project scope is not so large that you need to bring in the custom tooling again (which, of course, is what happens in AAA). It's not "sneeze and you've preserved history" easy, but we can work on that.

And what if we're talking about an online game? That's the preservation nightmare taking place today. All the MMOs, mobile games, and so on - those are living things. You can't reconstruct a userbase. So even with running code and assets, you get only part of the experience; it will be empty, fossilized. You will have to fall back on video footage to see it in contemporary context. That said, there is the occasional revival effort, such as with Habitat[1].

tl;dr: if you want better archival, support game labor, and support projects that improve basic software infrastructure.

[1] https://en.wikipedia.org/wiki/Habitat_(video_game)#Revival


I am not sure where we disagree.

Everything you're talking about is a legitimate problem. I agree with you completely. However, source access helps with the majority of it -- even with making better emulators.

But even disregarding source code entirely, the game industry is also not particularly great at preserving final releases. If you're a big, popular title, sure, that'll probably be available. But if you're a smaller title, or you saw a limited regional release, or you're a demo for an unreleased game, it's not a given in our industry that you actually end up in a vault somewhere. And that's the first step -- you can't run a game in an emulator unless you have the game to run.

The easiest question is, "do Sony, Microsoft, and Nintendo have an official archive where they preserve the release binaries of every single game that is released on any of their platforms?" The second easiest question is, "how about source code?" And then after that things start to get complicated and all of the concerns you're talking about come into focus.

In the film industry, there are cross-studio efforts to make sure that raw footage isn't lost. That's the result of prominent directors and studios taking the industry to task for many years, often after they had been personally bitten by film degradation. Nothing like that exists in the games industry. On the contrary, if you read over the comments on this article, you'll find multiple instances of devs arguing that preserving history is less important than shipping the next title, a kind of weird throwback to the same philosophy that led to old film studios destroying film reels from unsuccessful movies so they could be reused on the next project.

Of course, that's not a problem with Crash. It is, luckily, famous, so lots of copies exist. But apparently very few of the raw assets do. And it seems to me that if Sony can't even solve that problem, when Sony seems to have a clear financial incentive to solve it, they're not really in much of a position to start tackling MMO community preservation yet.


> And it seems to me that if Sony can't even solve that problem, when Sony seems to have a clear financial incentive to solve it, they're not really in much of a position to start tackling MMO community preservation yet.

There's a problem with that argument, though. Sony et al. have an incentive _now_ to want the content for older games. But it wasn't until much later that rereleasing games, much less archiving bits of them for future use, became a common thought.

I'm not arguing that the industry has gotten particularly better at that in the interim, just that there wasn't an incentive for it historically, and concluding that they can't do it when incentivized to do so doesn't follow.


That is a very good point.


I think most of the dynamic/static bias depends on the specific experiences of your problem domain.

If you have a lot of types of data to model (layers of records, sequences, indirection, etc.), dynamic types make it easy to start bodging something together, but they result in write-only code. So you end up wanting additional structure and definition - maybe not immediately, but just after prototyping is done and you have a first working example.

If you're just applying a routine algorithm that ultimately does a round trip from SQL, you don't need that additional assurance. The database is already doing the important part of the work.

If you have simple data but it needs to go as fast as possible, you end up wanting to work at a low level, and then the machine size of the data becomes important - so you end up with a static types bias.
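The first case above can be sketched in a few lines of Python (the record shape and field names here are invented for illustration): prototype with a bare dict, then name the structure once it settles.

```python
from dataclasses import dataclass

# Prototype phase: a bare dict is fast to bodge together, but
# every consumer must remember the keys - classic write-only code.
order = {"sku": "A-100", "qty": 3, "price_cents": 499}

# Post-prototype: once the shape is known, a declared record makes
# the structure explicit; a mistyped field name now fails loudly
# at construction instead of surfacing as a KeyError much later.
@dataclass
class Order:
    sku: str
    qty: int
    price_cents: int

    def total_cents(self) -> int:
        return self.qty * self.price_cents

o = Order(**order)
print(o.total_cents())  # 1497
```

The dict and the dataclass hold identical data; the only thing added is a written-down commitment about its shape, which is the "additional structure and definition" that layered data ends up demanding.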


Another helpful comparison is Lisp syntax versus Forth syntax. If you actually want a simple, terse, unstructured language, Forth is the way to do it: words consume and return any number of inputs or outputs on the stack, and when the stack is unbalanced you get entire classes of error that would surface as syntax errors anywhere else.

Just adding s-expressions is actually a huge jump up in syntactic complexity and structure, since now you can define a whole hierarchy statically, "on the page", instead of having to use a series of stack operations to manipulate memory from afar. And then as you lucidly put it, going from s-expressions to a practical Lisp dialect adds another level of context on top of that.

(And yeah, it's possible to bolt structure onto a Forth dialect, too, but the "Chuck Moore way" is that you do that custom every time you need it, instead of trying to generalize. Generalizing a concatenative structure leads to something like Factor.)


That really speaks to the horrors of America, given that in Japan, as a white male foreigner with command of "hai", "arigato" and on good days, "ah, so desu ne", I definitely got the sense of being put up with - albeit more so in Tokyo than elsewhere.

But at the same time I can sympathize, in that street interactions in California have grown openly hostile across the board. The level of polite hospitality is so low and the paranoia so high that it is actually hard to go outside and do business without ending up in a confrontation of some kind. Just the other week I was crossing the street in SF after seeing a movie[1], and a lady (a white lady) crossing the other direction gave me the finger and screamed "fuck you, fuck you" for no reason I could fathom.

We passed, and then I had a feeling of deja vu and thought, "didn't that happen the last time I saw a movie here too? Does this woman just go around in the evening cursing out strangers for self-care?"

And that is the kind of thing that sounds dreadfully believable for SF these days. The privilege spares me from most avenues of material loss, but that's hardly something to be proud of.

[1] The film was Sorry to Bother You, if it matters. I enjoyed it.


Early 3D games are primarily limited by how they handle occlusion. If your scene is designed favorably (e.g. wireframe spaceships), you can ignore occlusion, but that doesn't describe most interesting scenes.

Leaderboard uses a kind of painter's algorithm to do everything, which is flexible but requires overdraw, and by itself will produce rendering errors in many scenes because depth is sorted per object or per triangle, rather than per pixel. Filled-triangle rasterization is particularly expensive to compute and requires a fair amount of numeric precision, both of which compound the cost of overdraw.
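The per-triangle sorting at the heart of a painter's algorithm fits in a few lines; this Python sketch (coordinates invented, drawing stubbed out) makes both the overdraw and the failure mode visible.

```python
# Painter's algorithm sketch: sort primitives far-to-near using one
# depth value per triangle, then draw in that order. Later (nearer)
# draws overwrite earlier ones - that overwriting is the overdraw.
# Because depth is resolved per triangle rather than per pixel,
# intersecting or cyclically overlapping triangles still come out
# wrong: no single ordering of whole triangles can be correct.
def painters_order(triangles):
    # Each triangle is three (x, y, z) vertices; larger z = farther.
    def avg_depth(tri):
        return sum(vertex[2] for vertex in tri) / 3.0
    return sorted(triangles, key=avg_depth, reverse=True)

tris = [
    ((0, 0, 1), (1, 0, 1), (0, 1, 1)),   # near triangle
    ((0, 0, 9), (1, 0, 9), (0, 1, 9)),   # far triangle
]
for tri in painters_order(tris):
    pass  # draw(tri): far triangle first, near triangle on top
```

A per-pixel depth buffer removes the ordering problem entirely, but storing and comparing a depth per pixel was out of reach on the hardware these games targeted.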

Fractalus does not use these techniques for terrain - although the exact implementation is idiosyncratic[1], it ultimately fills an outline that defines the horizon, then adds texture to the landscape with dot patterns. Bear in mind, it doesn't render to the full screen either - it's mostly HUD - while Leaderboard is a little more expansive.

[1] A similar technique is described for Captain Blood: http://bringerp.free.fr/RE/CaptainBlood/main.php5

