Hacker News: bluejekyll's comments

A discussion on licenses will go sideways very quickly. GPL does limit the adoption of software in certain environments, so it really depends on your goals. Do you want an OSS project that will be usable by everyone (including corporations), or do you want to guarantee that the software will always be OSS and that corporations can’t benefit from it without contributing back (potentially requiring them to open their own proprietary code)?

People apply a lot of moral perspective to this decision, but not all developers have the same goals for their software. MIT is more flexible in its use than GPL, but doesn’t help ensure that software remains open.


> MIT is more flexible in its use than GPL, but doesn’t help ensure that software remains open.

Sure it does. The original software will always remain open. It isn't like people can somehow take that away.


GPL is copyleft: it has a stated goal of encouraging more software to be OSS, including new contributions. That’s what I meant by software remaining open. MIT, on the other hand, can be used in closed-source situations. While the original code will remain open, future changes are not required to be open source.

They can use it on locked devices where you cannot replace it though. And then what do you do with the source? Print it and appreciate its beauty?

I found the title for this post misleading. To clarify it a bit, AI has only improved productivity by 10% even though 93% of devs are using it.

Yeah, the title may suggest that productivity is down to 10% of what it was after CEOs fired half of their developers, believing that the rest would do all the work with the help of AI.

This is ultimately the thing that needs to be fixed. The exemption for small trucks was stupid, and it should have been reserved for literal farm equipment (as originally intended). The fact that SUVs now slip by on this has created such a dumb market.


The OBBB Act passed by Congress last year eliminated the financial penalties associated with violations of CAFE standards, so there’s presumably no reason why automakers have to abide by them anymore, except possibly for concerns about future legislation.


> I need that little bubble that separates me from other people.

I get the same independent feeling you describe while riding my bike (not a bubble, but the bubble of a car is a false sense of security anyway, given the roughly 40k car occupants who die every year in the US). In fact, I generally enjoy the bike experience more than I ever did driving, because I never get stuck in car traffic or behind a line of cars at a traffic signal, and I never need to worry about parking, other than finding a secure place to lock up (which some destinations lack). I used to love driving, but I started commuting by bike for work and realized over time that I enjoy biking so much more that I now go weeks without ever driving.

FWIW, I live in a smaller American city of about 120k people, though it is part of a greater metro area.


I really like this advice, but aren’t the first two examples equivalent, despite being given different advice?

    // Good?
    for walrus in walruses {
        walrus.frobnicate()
    }

Is essentially equivalent to

    // BAD
    for walrus in walruses {
        frobnicate(walrus)
    }

And this is good,

    // GOOD
    frobnicate_batch(walruses)

So should the first one really be something more like

    // impl FrobnicateAll for &[Walrus]
    walruses.frobnicate_all()
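A minimal sketch of what that batch-oriented API could look like. The `Walrus` type, its field, and the `FrobnicateAll` trait are all hypothetical, just following the naming in the comments above; the trait is implemented on a mutable slice rather than `&[Walrus]` so the batch operation can actually modify the walruses:

```rust
// Hypothetical type, following the thread's naming.
#[derive(Clone)]
struct Walrus {
    tusk_length: f64,
}

// Batch trait: one call over the whole collection, so an
// implementation can amortize any setup work across all walruses.
trait FrobnicateAll {
    fn frobnicate_all(&mut self);
}

impl FrobnicateAll for [Walrus] {
    fn frobnicate_all(&mut self) {
        // The loop becomes an implementation detail hidden from callers.
        for walrus in self.iter_mut() {
            walrus.tusk_length += 1.0;
        }
    }
}

fn main() {
    let mut walruses = vec![Walrus { tusk_length: 1.0 }; 3];
    walruses.frobnicate_all(); // auto-derefs Vec<Walrus> to [Walrus]
    assert!(walruses.iter().all(|w| w.tusk_length == 2.0));
    println!("frobnicated {} walruses", walruses.len());
}
```

The point of the batch form is that the caller can’t accidentally reintroduce the per-element loop at the call site; any per-batch optimization lives in one place.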


I suspect that Mark Twain would even greatly appreciate the humor in referring to it that way.


He once said he would, in fact!

;)


Is Mark Twain still alive? I heard they exaggerated his death...


The American Heritage Dictionary is far better than Merriam-Webster in my experience.


Rust pretty much nails all of those.


Not really. By default, allocators will panic/abort if memory isn’t available. Recursive functions can overflow the stack and panic at a certain depth. Code generated by macros isn’t very visible to developers, and recursive macros are very common. Return values are checked only if the developer adds #[must_use].

You can overcome a lot of this if you invest heavily in the type system, but that depends on the developer.
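Two of those points can be sketched briefly. `Vec::try_reserve` (stable since Rust 1.57) surfaces allocation failure as a `Result` instead of aborting, and `#[must_use]` makes the compiler warn when a return value is silently dropped. The function name `risky` here is purely illustrative:

```rust
// Without #[must_use], a plain function's return value can be
// dropped silently; with it, the compiler warns at the call site.
#[must_use = "the outcome must be checked"]
fn risky() -> Result<u32, String> {
    Ok(42)
}

fn main() {
    // Fallible allocation: try_reserve reports OOM as an Err
    // instead of aborting the process like a plain Vec::push would.
    let mut buf: Vec<u8> = Vec::new();
    match buf.try_reserve(1024) {
        Ok(()) => println!("reserved {} bytes", buf.capacity()),
        Err(e) => eprintln!("allocation failed: {e}"),
    }

    // risky(); // a bare call like this triggers an unused_must_use warning
    let outcome = risky().expect("checked explicitly");
    println!("outcome = {outcome}");
}
```

This only mitigates the allocator point, though; nothing in the language forces libraries you depend on to use the fallible APIs.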


“The tea bags used for the research were made from the polymers nylon-6, polypropylene and cellulose.”

They aren’t pure plastic.


Those were three different bags, not all in one.


As I recall, BeOS was asking on the order of $80 million, NeXT was acquired for $400 million.

I found this reference, which gives an $80M valuation, with Be wanting upwards of $200M: “In 1996, Apple Computer decided to abandon Copland, the project to rewrite and modernize the Macintosh operating system. BeOS had many of the features Apple sought, and around Christmas time they offered to buy Be for $120 million, later raising their bid to $200 million. However, despite estimates of Be's total worth at approximately $80 million,[citation needed] Gassée held out for $275 million, and Apple balked. In a surprise move, Apple went on to purchase NeXT, the company their former co-founder Steve Jobs had earlier left Apple to found, for $429 million, with the high price justified by Apple getting Jobs and his NeXT engineers in tow. NeXTSTEP was used as the basis for their new operating system, Mac OS X.”

https://en.m.wikipedia.org/wiki/Jean-Louis_Gass%C3%A9e


I don’t remember the exact number, but BeOS was too incomplete at the time to spend what they were asking, and maybe to purchase at all. There was no way to print documents, which still mattered a lot for a desktop OS in 1996. It needed a lot of work.

Now, in retrospect, Apple had time; Mac OS X wasn’t ready for the mainstream until 2003-2004.


To be fair, printing in 1995-6 was a huge can of worms and hell on earth.


Send PostScript, done. Today it’s figuring out which driver will properly rasterize exotic things like ligatures, because we decided that throwing a real CPU in the printer was a mistake.


Unless you were using anything from that tiny, obscure Hewlett-Packard operation and didn’t want to shell out for a module. HP never promoted PostScript. It was far from universal as an embedded PDL.

> that throwing a real CPU in the printer was a mistake.

The CPU in any decently modern printer is still many times more powerful than what was in an original LaserWriter (30ppm and up needs power, even if it’s simple transforms and not running wankery). It’s not just about CPU power: modern laser printers still support PDLs and vector languages like PCL and PDF (and many have some half-assed, often buggy PS “compatibility”, e.g. BRScript). The bigger mistake is using a general-purpose Turing tarpit that is “powerful” rather than a true high-level, built-for-purpose PDL. PostScript just isn’t very good and was always a hack.

> Send PostScript, done.

The other problem, of course, being that raw PostScript as a target for most applications is not at all elegant and is, ironically, too low-level. So even if you wanted PostScript, an OS that didn’t provide something more useful to most applications was missing core functionality. The jwz quote about regexes applies just as well.


> Unless you were using anything from that tiny obscure Hewlett Packard operation and didn’t want to shell out for a module. HP never promoted Postscript. It was far from universal as an embedded PDL.

It's why HP had a series of printers marketed explicitly for Macintosh use, whose only difference from the otherwise identical model was that the PostScript interpreter module was included as standard, as the Mac didn't really support non-PostScript printers with anything resembling usability.


There was a time when the fastest 68k processor Apple shipped was in the LaserWriter (12 MHz instead of the 8 MHz in the Mac).

I seem to recall a story of someone internal to Apple figuring out how to run a compiler or other batch processing system on the LaserWriter as a faster quasi-coprocessor attached to a Mac.


I remember that time. I was taking a graduate level intro to graphics class and we had an assignment to write a ray-tracer and turn in a printout of the final image along with a printout of the source code. The instructor allowed any programming language, so I used a different one for each assignment.

For the ray tracing assignment I used postscript, the PS image operator calls a function to return each sample in the image. The transform matrix made scaling the image easy.

My code was two pages long, up from one page because of the many comments. I think the next shortest was 15 pages. It also ran faster than most others because of the faster processor.


Don Lancaster (outside of Apple) did that. In fact, he ignored the Mac and connected a LaserWriter directly to his Apple II, and programmed in straight PostScript. Used that language the rest of his life. All the PDFs on his site were hand-crafted.


Oh, I knew that was coming. This interesting but ancient piece of trivia just illustrates how slow micros were back then. It’s not like printers don’t have more, and multiple, CPUs today. It’s not like whatever poorly written, outsourced “managed” shit and other features are going to run on a potato. Whatever is driving the color touch LCD on even the Walmart econoshit is many times more powerful than that 12 MHz 68k.

Still have no idea what the GPs point was. You can just as easily run a raster on the host, if it has bugs it has bugs, where it lives doesn’t matter.

Further rose-tinting, of course, is that the LaserWriter was $20k, and it’d be a decade-plus before a monochrome laser dropped under $1k. I’m gonna guess the Canon with the shitty drivers is 10x cheaper and faster.


The amount of data transfer for 300x300 DPI full page images is high even now, most printers still render fonts and such internally.


It really isn't that much though. A 1200x1200 DPI monochrome image on Letter size (not even considering margins) paper is on the order of 16 MiB uncompressed. And bitmaps of text and line art compress down heavily (and you can use a bitmap atlas or prerendered bitmap font technique as well). It’s also usually easier to upgrade RAM in a printer than a crappy firmware.
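The back-of-the-envelope arithmetic behind that 16 MiB figure can be checked directly (US Letter is 8.5 x 11 inches, one bit per monochrome pixel):

```rust
fn main() {
    // US Letter at 1200 dots per inch, 1 bit per pixel, no margins.
    let dpi = 1200u64;
    let width_px = (8.5 * dpi as f64) as u64; // 10_200
    let height_px = 11 * dpi;                 // 13_200
    let bits = width_px * height_px;          // one bit per monochrome pixel
    let mib = bits as f64 / 8.0 / (1024.0 * 1024.0);
    println!("{width_px} x {height_px} px = {mib:.2} MiB uncompressed");
    // ~16.05 MiB, matching the comment's estimate; run-length or
    // similar compression shrinks text and line art far below that.
}
```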

> most printers still render fonts and such internally.

Many printers have some scalable font rendering capability, but it is often not usable in practice for high fidelity. You absolutely can raster on the host to either a bitmap font, or make use of the PDL's native compression. Most lower end printers (which is pretty much the bulk of what is sold) do not have the capability to render arbitrary TrueType fonts, for instance. A consumer/SOHO level Canon laser using UFRII is going to rely on the host for rastering arbitrary fonts.


I have a modern Canon laser printer that does not properly implement ligatures because of obscure driver issues. What I see on the screen is not what is printed.


Text layout is hard and unfortunately drivers and firmware are often buggy (and as printing is lower and lower margin that doesn’t get better). But just throwing a weird language engine in doesn’t actually solve any of those problems.


Text layout doesn't need to be done when the source is a PDF. Make printers do PDF, let Adobe control trademark access via conformity tests, and life is good.


The biggest errors I’ve found are when the PDF says it’s using FontX or whatever, the printer claims to support the same version, and it’s subtly different.

The PDF tool assumes it doesn’t have to send the full font, and the document comes out garbled. Printing as an image sometimes gets around this.


> Text layout doesn't need to be done when the source is a PDF.

PDF isn’t a panacea, since it’s complex enough that printing any random PDF isn’t trivial at all. But sure, close enough; before, though, you were talking about PostScript.

> Make printers do the PDF and let Adobe control trademark access via conformity tests and life is good.

PDF printers aren’t all that uncommon. So why doesn’t your Canon do this? These aren’t technical issues at all. This is an economic/financial problem as mentioned (doing business with Adobe!). This isn’t about part cost, a CPU 100x more powerful than the one in the LaserWriter is nothing.


’96 was still a few years before all printers natively supported PostScript; HP had its own vector graphics language, for example.

Many printers in common use were still “one font wonders” and that resulted in lots of fun.


ISTR those of us using HP or Apple printers were generally in pretty good shape at the time. Can’t vouch for other brands.


Apple didn't support anything other than PostScript natively at the time, so their printers came with PostScript support. HP made special models for use with Macs that shipped with PostScript included.


Apple sold a ton of printers without PostScript:

https://en.wikipedia.org/wiki/List_of_Apple_printers

The earliest ones pre-dated PostScript by years.


The high price was also justified by the success of WebObjects at the time, which was seen as a potential fuel for an IPO for NeXT, even though WebObjects was not what Apple was buying it for. This Computer History Museum article goes into that angle in detail: https://computerhistory.org/blog/next-steve-jobs-dot-com-ipo...


“BeOS did not have printing” was the insult thrown around at the time.

