Hacker News | one-man-bucket's comments

A (so far) in-house-built service that manages DNS SRV records.


  Less than three? Control-C, Control-V.

  Three or more? Refactor!
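
(A minimal sketch of the rule of three in practice, in Python; the function and handler names are made up for illustration, not taken from anything above:)

  # Two copies of the same cleanup pasted into two handlers: tolerable.
  # Once a third handler needs it, extract a shared helper.
  def normalize_hostname(name: str) -> str:
      """Shared helper, extracted when the third copy appeared."""
      return name.strip().rstrip(".").lower()

  def handle_create(record: dict) -> None:
      record["host"] = normalize_hostname(record["host"])

  def handle_update(record: dict) -> None:
      record["host"] = normalize_hostname(record["host"])

  def handle_import(record: dict) -> None:  # the third caller that triggered the refactor
      record["host"] = normalize_hostname(record["host"])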


You seem to have an integer overflow error or something.

> Coins in circulation

> 10994200

> that's 224,516,226.80 kr

That's not right, $2,049,318,880.00 ~ 13,192,490,290.00 SEK
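
(A back-of-the-envelope check, as a minimal Python sketch; the per-coin price and exchange rate are assumptions, simply backed out of the totals quoted above:)

  coins = 10_994_200
  usd_per_coin = 186.40   # assumption implied by the quoted USD total
  sek_per_usd = 6.4375    # assumption implied by the quoted SEK total
  usd_total = coins * usd_per_coin
  sek_total = usd_total * sek_per_usd
  print(f"${usd_total:,.2f} ~ {sek_total:,.2f} SEK")
  # roughly $2,049,318,880.00 ~ 13,192,490,290.00 SEK -- nowhere near 224,516,226.80 kr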


I don't get what's supposed to happen. To me it looks like this: http://i.imgur.com/CMwdLNg.png


This is what it looks like for me: http://i.imgur.com/ppMYz6c.png


Same for me


+1, same for me.

(to me it looks like some kind of particle accelerator experiment happening in your browser, where ก็็็็็ emits some kind of unknown radiation. After closer examination [zoom to +300%]: maybe it just shows the escaping life spirits of the toppled Latin small letter «u» after being shot in the right side.)


Chrome on Windows 7 64-bit:

http://imgur.com/3u1M2IG


It seems to render right on your browser. This is what I see: http://i.imgur.com/gJ1xBWn.png


It's actually more correct in your browser. It's a stack of diacritics. The rasterizer above was printing them on top of each other, which is (well, seems likely to be) typographically incorrect. The problem, of course, is that the implemented rules should disallow combinations of diacritics that never occur in the real world, but they don't.

But it's not a "rendering" bug.
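
If anyone wants to reproduce the effect, here is a minimal Python sketch of the kind of string involved: one base letter followed by a pile of combining marks (the particular mark and the repeat count are arbitrary, not taken from the page in question). Whether the marks stack, overlap, or get dropped is entirely up to each platform's text renderer, which is why everyone's screenshot looks different.

  # Build "zalgo"-style text: a base letter plus repeated combining marks.
  base = "u"
  combining_acute = "\u0301"        # COMBINING ACUTE ACCENT
  stacked = base + combining_acute * 30
  print(stacked)                    # rendering differs by OS / browser text stack
  print(len(stacked))               # 31 code points, but visually one "character"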


About the same for me: http://i.imgur.com/fOTopJU.png

Both Opera and Chrome on Windows XP. On Windows 7 I get the interesting-looking results with both Opera and Chrome.


I have tested with Firefox (latest version) on Windows XP and 7, and I think we can conclude the problem comes from Windows XP, as it shows the same thing as you, whereas it gives a whole big stack on Windows 7.


What browser and platform are you using?


Works fine for me. Safari 6, OS X 10.8.


Works in Chrome 25, 10.6.8


I'm pretty sure it's a Windows GDI font rendering problem. Windows DirectWrite, Mac's CoreText and other platforms seem to do okay.


It seems to work on Firefox x64 19.0 from a Gentoo ebuild.


With Opera 12.14, I saw the same thing. I had to open the page in Chrome to see the weirdness.


Chrome 25 and Xubuntu 12.10


It seems to work on most Linux distros. I'm using Chrome 25 on Windows 7 and it's buggy.


http://i.imgur.com/5ov1qdf.png

This is an interesting result. Even more interesting is the extra 2000 results that Google throws in my direction.


I actually get a different rendering for some reason: I get all the extra diacritics to the right, above empty circles. That certainly explains why I didn't quite understand the problem.


Looks like it's only a problem on Windows (and maybe Linux?). OS X is fine in both Chrome and Safari.


Same here; it looks fine for me in Chrome and Firefox on Ubuntu.


I'm not trying to flamebait, but are Macs so popular that it's just assumed that all new hires want one as their tool? At my company we're still asking new employees which platform they want to work on; is this falling out of fashion?


Everyone technical I see in Silicon Valley, with maybe 5 exceptions, is on Mac laptops. A fair number of serious devs have Linux workstations to go along with their Mac laptops.

People who do some other stuff (heavy email users, some video/audio people) do sometimes use Windows (it's weird, but I think the Windows audio stuff is better for realtime now than Mac).

(the exceptions are people with FreeBSD, NetBSD, and Linux laptops)

I'd be pretty comfortable as a Silicon Valley employer only supporting Macs for office automation, and then either Mac or Linux for development workstations. If someone really wants Windows, s/he can support it independently.

The harder problem is phones -- there are people who are religiously attached to iOS and to Android, and you basically need to support both. There are pretty good MDM tools to cover both at the same time, but it does mean you can't push enterprise apps unless you do cross-platform development.


At my company (a small CMS company, bought a few years ago by an older, larger media company, recruiting in Stockholm), we're offering new employees a Mac or Linux laptop, but there is a lot of hinting that "you'd probably be best off picking a Mac", and a few new hires have opted for the Mac for reasons that to me sound like they think it's the "company DNA" or something.

I'm not religious about platform choice, but I'd hate to see a future where developers are pigeonholed with regard to their tools.

Edit: currently 13 of our 15 devs use Macs.


Yeah -- by OA I mean machines for handling email, spreadsheets, etc. Cross-platforming documents kind of sucks still.

A reasonable compromise is Mac laptop for office tasks, and a VM for development, and then a desktop development machine. If someone is super mobile as a developer, I could see a Linux laptop as an option.

The annoying thing is that if you really want security, you are basically stuck with Windows 7 or Windows 8 now, at least for desktop/laptops, and iOS or BB for phones. (Windows and platform management has gotten better -- OS X is actually the least secure OS in a major corporate environment today, due to lack of security and management tools. It's still decent for unmanaged use vs. Windows or Linux.)

In an environment processing highly sensitive information (say, a law firm working on M&As, or a print shop handling annual reports), where the tools aren't that essential to work, you could have a legitimate "you must use only our locked down systems" argument. I wouldn't really want to work in a place like that, though.

The long-term solution in high security environments is probably a mobile-based OS for desktop/tablet/mobile use, and then virtual desktop into either a super locked down existing desktop OS, or some new environment. For a lot of stuff, locked-down tablet/mobile (or ChromeOS) connecting to SaaS apps could probably do it.


I use a Mac too, but it's running Linux. Do you mean OS X, or the machine?


Seriously. I find Apple's UI to be really frustrating to use. And honestly, I can't go back after switching to XMonad.

I have many happy coworkers who use Macs, but please don't make me use one.


Likewise. In fact, I like the Windows (7) UI better. I'm a full-time desktop Linux user, but not because of the desktop environment itself. I'd be happy with Windows as the UI on top for "desktop stuff".

Note: I used Macs daily, for hours at a time, from OS 6 to OS X 10.3, before I started using Windows and Linux more.



An X11 window manager can only manage X11 apps.


There's nothing keeping you from using the same X11 apps on OS X as you do on Linux.


Fair enough. I was super into XMonad when I used Linux, but never really found the need on Mac OS. I remember reading up on this topic back then, maybe that's why I didn't make the switch.


Those instructions are rather outdated; they don't work well with Lion or later.


We don't assume anything. New employees here at GitHub can choose whichever platform they prefer—it just so happens that most people here go for Macs of their own accord :)


Out of curiosity, how many developers do you have who primarily use a Windows machine?


One. The rest of the Windows team virtualizes on a Mac.


I'm guessing the one is Phil Haack.


Nope, Markus Olsson.


It's not really about Macs per se; it's about the Unix toolchain, and, unlike Linux, Macs have Photoshop, etc.


I think you hit the nail on the head. The reason so many devs go for a Mac nowadays is to have a good laptop that you don't have to babysit or configure much yourself and that's well integrated with a Unix toolchain. On a desktop, many of these advantages go away compared to a Linux workstation, so many go for that instead.


I had to babysit OS X a lot more than I had to babysit Linux. Ubuntu might take one or two tweaks when installing on a MacBook, but twenty minutes later it runs perfectly forever (or six months, whichever comes first).

I had to babysit OS X all the time, due to its lack of a good package manager. This was years ago; I don't know if it has improved, but it grated on me greatly.


> This was years ago; I don't know if it has improved, but it grated on me greatly.

Well of course, OS X has improved heavily since the early days (I'd call "early" everything below 10.4). The surge of developers wanting to use OS X has also increased the demand for package managers, and so they came: first Fink, then MacPorts, and nowadays most people use Homebrew. You install them once using the standard OS X pkg installation facility and then you're good to go - the range of "backend software" packages in brew is comparable to apt-get, I'd say. For everything from Linux that needs a GUI I have a VM, and for everything Windows-only I have another VM. There's never even a question of whether I can run something locally - once you have that, it's hard to give it up again.

That being said, the best OS X release is probably 10.6; since then I haven't liked the direction very much - but on Macs you're basically forced to use the newest OS (Xcode compatibility, and hardware compatibility once you upgrade your MacBook). The iOSification hasn't been a dealbreaker for me so far; it's IMO still a better all-round experience than any other notebook, especially considering the service quality, which is second to none where I live. A friend of mine bought a $2.5k Lenovo - the board went dead after 1 month, they picked it up, and he hasn't seen it in the six weeks since, with no replacement. I had a similar issue with my rMBP - got it back, fixed, after 2 days. Similar stories about Dell. HP might be better, but their hardware is crap IMO. I just can't trust any other laptop manufacturer at the moment, which makes me sad.


I somewhat disagree with Homebrew being thrown in with Fink and MacPorts. The benefit of Homebrew is that it doesn't require package maintainers to babysit packages and make sure the binaries work and are updated. You basically just take the raw source and someone throws a patch on it. It's actually a thousand times better because you don't have to worry about compiling binaries that work on everyone else's systems no matter what crazy config they have.


Well, I didn't say it's the same kind of architecture. For the end user, however, it serves the same purpose.


What I was trying to get at is that those systems were unsustainable, and I can see Homebrew developing into a legitimate "default" package manager that is infinitely maintainable, rather than Fink/MacPorts, which were simply shims doomed to fail by their architecture.


Agreed, which is why when it comes to testing a legit setup, I usually do it on a virtual machine. Or of course one of those $5 Digital Ocean VPSes.


My main gripe with the Mac's supposed 'toolchain' is that the clones are nowhere near comparable to 'apt-get'.


Well, for one, you probably don't want them to be apt-get in terms of installing binaries. We had that with Fink, etc., and it sucked and got outdated quickly. Homebrew is pretty solid though, probably better than what Apple would come up with. They (Apple) actually helped make what you'd like partially a reality by releasing the Command Line Tools, so you don't have to bother with Xcode.


Why not? Binaries are so much better for a platform like the Mac (basically just 32-bit and 64-bit Intel architectures at this point, with fairly homogeneous OS versions across the installed base). Why on earth make everybody waste their local machine cycles compiling something that should end up the same for everyone anyway?

Fink was only out of date because they weren't keeping it up to date (perhaps their builds weren't automated? That's pure speculation; I have no idea). Debian manages to keep things more or less up to date with more packages (and architectures) than Fink ever had.

Incidentally, if you put your Homebrew in /usr/local, then it will often install using the "pour" technique, which is pure binary distribution.

I loved Fink until it started languishing, and I love Homebrew now: its "everything in git" philosophy and the maintainer's very-open-to-pull-requests attitude make me optimistic that it won't slow down and become irrelevant like Fink did. I think that's the real difference between the projects.


We offer new employees Apple hardware, on which they are free to use OS X or Linux. Apple hardware is the best hardware out there, so that's a no-brainer. We avoid Windows for security and compatibility reasons.


Yes, sadly. At my interview they said I could have a Linux machine, and I believed them. I got a freaking Mac. It's assumed that everybody secretly yearns for a Macintosh.


When I quit my first job, my former employer made a point of conducting all exit interviews with someone who was in no way an authority figure for the person quitting (I did mine with the receptionist).

I was of course very polite; I didn't want to burn any bridges and could still consider going back there some day. But I was also honest when telling them that the main reason (that was in their control) for me quitting was that there was no development process (e.g. Scrum), and that the first thing on their to-do list should be to hire a good head of development.

They did this a year after I left, and from what I've heard they have kept the rest of my team and even attracted some new talent. Maybe there are cultural factors at play as well (I'm in Stockholm).

They invited me back later for beers and StarCraft 2, so I guess they don't hate me too much :)


I couldn't stand Unity either, and for me it was more an issue of coming from an earlier version of Ubuntu (GNOME).


The €210 price tag is a bit too steep for me. I would have bought one for €60.


Don't laugh, but I read the shipping fee as the total price. I thought it was adequate.


One of the main reasons I hang out in programming help channels on IRC is to hone my skill at reading other people's messy code while helping them solve a problem. Plus, there's the bonus good feeling from helping people :)


Planet Money did an episode on why ToS and EULAs aren't human-readable (as opposed to lawyer-readable).

Here's a link to the episode: http://www.npr.org/blogs/money/2011/04/21/134633336/why-are-...

