
I always find the Fogg behavior model [0] helpful to analyze why I choose one behavior over the other.

  B = MAP

  Behavior = Motivation * Ability * Prompt
Take the little graph he shows, which plots ability/easiness along the X-axis, and motivation along the Y-axis. Put two of them side by side, and plot each decision according to how easy/hard it is for you, and how motivated you are.

Whichever plot is closer to the top-right is the one you’ll choose.

Try this by comparing:

1. Wake up to your alarm on your phone (P=1), a bunch of Instagram/Youtube notifications (M=.97), and it’s so easy to scroll you can do it with one finger (A=.98). This is at the top-right of the graph.

2. Get out of bed, find your running shoes (P=0), go for a run (M=.6, A=.5). This is in the middle of the graph but also invisible (no prompt).

So you pick lying in bed scrolling over going for a run, every time.
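As a toy sketch, treating B = MAP as a literal product (a simplification of Fogg's model, using the illustrative scores from the example above):

```python
def behavior_score(motivation: float, ability: float, prompt: float) -> float:
    """Toy reading of Fogg's B = MAP as a plain product:
    with no prompt (P = 0) the behavior never triggers at all."""
    return motivation * ability * prompt

# Scores from the example above
scrolling = behavior_score(motivation=0.97, ability=0.98, prompt=1.0)
running = behavior_score(motivation=0.60, ability=0.50, prompt=0.0)

print(round(scrolling, 4), running)  # 0.9506 0.0
```

Moving the phone to another room drags the scrolling prompt toward 0, and the shoes by the bed give the run a prompt, which flips which score wins.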

You can flip this by, say, getting an alarm clock that’s not your phone and putting your phone in another room, and placing your bright green running shoes next to your bed.

A key observation: P is the most powerful lever in this equation. Things like social media and junk food will always sit near the top-right of the graph, so the primary means of attacking them is removing the prompt.

The next most effective lever to adjust is A (ability), placing barriers in your way of the undesirable thing, and removing barriers from the desired behavior.

The least reliable lever is M (motivation). It's sometimes possible to increase or decrease motivation, perhaps by journaling about the benefits of exercise and the drawbacks of social media, but that tends to be inconsistent, or a long-term effort to shift intrinsic motivation. Motivation is a powerful force when you have it, but it isn't dependably there.

[0] https://behaviormodel.org/


Yes.

They are definitely heading that way. I've been saying for some time that I expect them to, and they are slowly proving my guess right.

It used to be that their cash cows were Windows (desktop and server), Office, SQL Server, and to a lesser extent Exchange (earlier still it was just DOS, Windows, and Office). Everything else, including Visual Studio despite the pile they charged for it, existed to funnel and capture people into those products, or to inconvenience startups that might later try to compete in one of those arenas.

Now their cash cows are Azure, Office subscriptions, and to a lesser extent SQL Server, Exchange, and advertising. They don't care what OS you use or what apps & services you run, but they want you to run it on Azure (unless you are a huge concern, in which case the money from on-prem SQL and other licences is still worth talking about) and pay them subscriptions for storage & processing.

They can't just abandon Windows, that would look bad, but they don't want their own stuff keeping it alive any longer than need be. If everyone shuffled over to Linux, Android, macOS/iOS, etc. but still used online Office and apps running in Azure, that would be ideal for them – the hassle of maintaining desktop Windows, with all its hardware compatibility issues and such, isn't something they would get into today if they were not already there.

Giving up on controlling their own browser engine was a big sign they were moving this way: let someone else deal with all that client UI gubbins – there is no practical MS-scale money in it – and concentrate on selling subscription services to office users and devs. I think the failure of their attempt at mobile market share was when this ball really got rolling internally, the thought at high levels in the business being “hang on, if we can walk away from that because it isn't worth the effort to keep pushing, could that be true of other end-user OS stuff too?”

It'll take time to move everything either properly cross-platform or at least browser-based. SQL Server was a big step in that direction, but that was relatively easy, as it is effectively a micro-OS sitting atop something else; the likes of Visual Studio will be harder. But it will happen, and then desktop Windows will be allowed to slowly die. Server Windows is already dying in the cloud: devs are being pulled away from caring about the base OS to running everything in OS-agnostic functions and light container-based services instead.

This is why they don't care that people like me won't ever be buying into Windows 11 at all, even where we did hold our noses and let Windows 10 happen.


My issues with Git

- No rename support, it guesses

- no weave. Without going into a lot of detail, suppose someone adds N bytes on a branch and then that branch is merged. The N bytes are copied into the merge node (yeah, I know, git looks for that and dedups it but that is a slow bandaid on the problem).

- annotations are wrong, if I added the N bytes on the branch and you merged it, it will (unless this is somehow fixed now) show you as the author of the N bytes in the merge node.

- only one graph for the whole repository. This causes multiple problems: A) the GCA is the repository GCA; it can be miles away from the file GCA you would get if there were a graph per file, like BitKeeper has. B) Debugging is upside down: you start at the changeset and drill down. In BitKeeper, because there is a graph per file, let's say I had an assert() pop. You run bk revtool on that file, find the assert, and look around to see what changed before it. Hover over a line and it shows you the commit comments for the file and then the changeset. You find the likely line, double click on it, and now you are looking at the changeset. We were a tiny company, we never hit the claimed 25 people, and we supported tons of users. This form of debugging was a huge, HUGE, part of why we could support so many people. C) commit comments are per changeset, not per file. We had a graphical check-in tool that walked you through the list of files, showed you the diffs for each file, and asked you to comment. When you got to the ChangeSet file, it asked you for what Git asks for in a commit message, but the diffs it showed were the file names followed by the comments you had just written. It made people sort of up-level their commit comments. We had big customers that insisted their engineers use that tool rather than a command line that checked in everything with the same comment.

- submodules turned Git into CVS. Maybe that's been redone, but the last time I looked at it, you couldn't do sideways pulls if you had submodules. BK got this MUCH closer to correct: a multi-repository setup produced results identical to a mono repository if all the modules were present (and identical minus whatever isn't populated, in the sparse case). All with exactly the same semantics and the same functionality, mono or many repos.

- Performance. Git gets really slow in large repositories, we put a ton of work into that in BitKeeper and we were orders of magnitude faster for things like annotate.
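The "no rename support" point is easy to see with stock git: a path-limited log stops at the rename commit, and `--follow` only recovers the older history by heuristically guessing that the delete+add pair was a rename. A minimal sketch (assumes git is on your PATH; driven from Python just to keep it self-contained):

```python
import subprocess
import tempfile

def git(*args, cwd):
    """Run a git command in the given repo and return its stdout."""
    return subprocess.run(["git", *args], cwd=cwd, check=True,
                          capture_output=True, text=True).stdout

repo = tempfile.mkdtemp()
git("init", "-q", cwd=repo)
git("config", "user.email", "demo@example.com", cwd=repo)
git("config", "user.name", "demo", cwd=repo)

with open(f"{repo}/old.txt", "w") as f:
    f.write("hello\n")
git("add", "old.txt", cwd=repo)
git("commit", "-q", "-m", "add old.txt", cwd=repo)

git("mv", "old.txt", "new.txt", cwd=repo)
git("commit", "-q", "-m", "rename to new.txt", cwd=repo)

# A plain log of the new path stops at the rename commit...
plain = git("log", "--oneline", "--", "new.txt", cwd=repo).splitlines()
# ...while --follow makes git *guess* the rename to recover both commits.
follow = git("log", "--follow", "--oneline", "--", "new.txt", cwd=repo).splitlines()
print(len(plain), len(follow))  # 1 2
```

The rename itself is never recorded; every consumer of the history has to re-run the guess.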

In summary, Git isn't really a version control system, and Linus admitted as much to me years ago. A version control system needs to faithfully record everything that happened, no more and no less. Git doesn't record renames, and it passes content across branches by value, not by reference. To me, it feels like a giant step backwards.

Here's another thing. We made a bk fast-export and a bk fast-import that are compatible with Git. You can have a tree in BK, have it updated constantly, and no matter where in the history you run bk fast-export, you will get the same repository. Our fast-export is idempotent. Git can't do that, it doesn't send the rename info because it doesn't record that. That means we have to make it up when doing a bk fast-import which means Git -> BK is not idempotent.

I don't expect to convince anyone of anything at this point, someone nudged, I tried. I don't read hackernews any more so don't expect me to defend what I said, I really don't care at this point. I'm happier away from tech, I just go fish on the ocean and don't think about this stuff.


I enjoyed your comment so much I've added it as a quote on my profile. Thank you!

https://metadat.at.hn/


This reminds me of that site linked here last year that explained and diagrammed the inner workings of a wristwatch. I used SingleFile to save the page, as I do with any page that impresses me (or is noteworthy or newsworthy or needs to be saved as proof, etc.).

> The only purpose of software is automation. There is ultimately no other goal. When your peers start talking about frameworks, easiness, a bunch of tools, or other bullshit realize they have absolutely no idea what they are doing.

This relates back to a course I took about software architecture. There's a guy called Johannes Siedersleben who separates software into three blood groups: A, T and 0.

According to his model, type A contains domain-specific business logic, type T technical code and type 0 is the glue. One should strive to keep these highly cohesive and especially loosely coupled.
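A hypothetical sketch of that split (the example and all the names in it are mine, not Siedersleben's): the domain function knows nothing about storage, the store knows nothing about the domain, and the glue is the only place they meet.

```python
# Type T: technical code -- a storage detail, knows nothing about the domain.
class InMemoryStore:
    def __init__(self):
        self.rows = []

    def save(self, row):
        self.rows.append(row)

# Type A: domain-specific business logic -- pure, no technical imports.
def total_price(items):
    return sum(qty * unit_price for qty, unit_price in items)

# Type 0: glue -- the only code that couples A to T.
def place_order(store, items):
    store.save({"items": items, "total": total_price(items)})

store = InMemoryStore()
place_order(store, [(2, 3.50), (1, 10.00)])
print(store.rows[0]["total"])  # 17.0
```

Kept this loosely coupled, the type A function can be tested with no store at all, and the store can be swapped without touching the domain logic.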

I found this an interesting way to think about software, and it made me realize that the majority of projects consist mostly of type A code, while most of what university teaches is how to write type T code.


Not the parent, but I'm also frequently saddened by the loss of wildlife, and my "point" is definitely not that it would be better if humans had never proliferated in the first place. I'm almost certain you can understand this sentiment. Most people have had to leave a home, break up with a significant other, or change careers in such a way that it ultimately worked out for the better and was necessary to achieve some goal, but the loss at the time still brought grief.

I'm not going to say this is an exactly analogous situation. One, I don't know that it really is necessary to human civilization thriving that we rapidly and recklessly destroy natural habitats the world over to make way for farms, or just plunder crops, pelts, and whale oil and whatnot for a few decades to make a single generation rich until it's all gone. There is very likely a more sustainable way to do it that isn't as disruptive to the pre-existing ecosystems and still allows us to feed and house a large number of people. This isn't purely a matter of "hey, they needed to go to make way for us." Think of blue whales, or bison in North America. Populations absolutely decimated, damn near reduced to nothing, until legislation and international treaties finally agreed to protect them. But all that ocean and all that grassland is still there. We didn't displace them to use it for something else. We killed them damn near for nothing, because blubber and fur could be traded for a lot of money for a few decades.

Second, it remains to be seen whether this is really for the better even for humans. All of this "success" you're talking about is the population exploding over the course of the last 500 years. The reasons for this are largely positive reasons. Indoor plumbing, germ theory of disease, global transportation networks, figuring out how to rapidly move food and water, not only for ourselves but for crops. Drastically increased land yield. All great stuff. But we're talking about a few centuries of success here. Sharks have been apex predators for 200 million years. Whether humanity is truly going to thrive over the long run or be a flash in the pan that disappears and leaves a mass extinction in its wake is yet to be determined.

But still, even if it ends up ultimately turning out for the best, it is still perfectly possible and reasonable to feel grief about the loss that virtually all other large animals with rich inner lives and social relations have had to endure while it happened. Have you ever read or watched much US Civil War material? I don't know that it's totally unique or even atypical for wars, but it's always poignant to me how the two sides largely didn't hate each other. Many of the victors took little joy in the victory. The land they plundered and the people they killed were their own country and their own countrymen. I can't say it was a bad thing, as it ultimately ended one of the most evil things humans have ever done to other humans, but it was still a sad thing.


The only person whose books were burned by the Nazis, the Soviets, and the Americans!

Not much by way of past threads, but he sometimes pops up in HN comments: https://hn.algolia.com/?dateRange=all&page=0&prefix=true&que....

There was a documentary not too long ago: https://vimeo.com/ondemand/wr1897. Has anyone seen it?

Kate Bush's lovely "Cloudbusting" is about Reich, based on the book his son wrote about driving around the Maine countryside with WR and his cloudbusting machine. Donald Sutherland plays Reich in the video, and Kate the son. The book is seen sticking out of her pocket in one frame. https://www.youtube.com/watch?v=pllRW9wETzw, https://hn.algolia.com/?dateRange=all&page=0&prefix=true&que...

"I still dream of Orgonon..."

The woman who preserved Reich's estate for 60 years, Mary Boyd Higgins, was remarkable in her own right: https://www.nytimes.com/2019/01/23/obituaries/mary-boyd-higg... (https://web.archive.org/web/20190124065725/https://www.nytim...).


Apparently, the x86 version of Flash was extremely well-optimized and, as a result, very spaghettified. Even porting it to 64 bits took Adobe years of effort.

Additionally, the SWF format itself was way ahead of SVG. It still is to some extent: https://open-flash.github.io/mirrors/swf-spec-19.pdf - look at the part that describes shapes, and compare it to SVG.

One thing that helped was the ability to have shapes with a fill on each side of an edge, allowing smoothly joined scenes that can be animated in real time. All without high-precision math: Flash used integers only, with some fixed-point data!
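For a sense of what that integer-only arithmetic looks like: the SWF spec's FIXED type is a 32-bit value with 16 integer and 16 fractional bits. A minimal, illustrative sketch (not Flash's actual code):

```python
# 16.16 fixed point: the low 16 bits hold the fraction, so 1.0 == 1 << 16.
FRAC_BITS = 16
ONE = 1 << FRAC_BITS

def to_fixed(x: float) -> int:
    return int(round(x * ONE))

def fixed_mul(a: int, b: int) -> int:
    # Multiply as plain integers, then shift the surplus fraction bits away.
    return (a * b) >> FRAC_BITS

def to_float(a: int) -> float:
    return a / ONE

print(to_float(fixed_mul(to_fixed(1.5), to_fixed(2.25))))  # 3.375
```

Everything stays in integer registers, which was cheap and deterministic on the hardware Flash originally targeted.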


These questions were explored somewhat in the Scythe trilogy[0]. Natural death was conquered so some people are tasked with going around and enforcing unnatural death. I enjoyed reading it and thinking over some of the implications.

[0] https://www.goodreads.com/book/show/28954189-scythe


Some thoughts on this, from an enthusiast, collector, and amateur player of odd and arcane keyboard instruments.

On the subject of one string per hammer: while this is novel for a modern acoustic piano, it is not especially so for other members of the family. Smaller harpsichords, for instance, are always strung like this, and even in larger ones having multiple "choirs" of strings, the number of strings being plucked simultaneously by a single key can always be configured, typically through levers or pedals.

In the 1970s Yamaha brought out a line of single-strung stage pianos (CP-70, CP-80, etc.) that were popular with rock players because they were relatively portable and significantly faster to tune. (Kawai also had an upright of similar design.) These were not loud enough to be played acoustically, but instead were fitted with electric-guitar-style pickups under each string much like the Una Corda has, producing a similar sound. In fact, as I listened to the soundtrack on the video, I was struck by the resemblance of the sound not only to the Yamaha CP-70, but to the Rhodes electric piano, which has rigid steel tines instead of strings, but also one to a key.

I like the ability to swap in various felts to modify the timbre; many modern harpsichords have this facility also, with something called a "buff stop." Player pianos in the first half of the 20th century frequently featured a lever that would lower a comb of felt with little metal rivets between the hammers and strings, producing a "honky tonk" or "tack piano" sound; presumably one could fashion something similar for the Una Corda.

The open, vertical design reminds me of the beautiful Clavicytheriums produced by the American harpsichord builder Steven Sørli (http://www.lautenwerk.com). Here's a video of one: https://www.youtube.com/playlist?list=PL89758D803857A19B

Yes, these are quiet instruments, but the late-Renaissance and Baroque music one would typically perform on them is much better served by this sparse clarity than can be produced on the modern grand piano. I suspect that the Una Corda would be similarly friendly to this repertoire.

Another benefit of the single-stringing is that, presumably, a capable player might reasonably expect to tune the instrument themselves, permitting the setting of very specific configurations other than twelve exactly equal divisions of the octave tuned to A=440. For acoustic and historical reasons, a lot of pieces really bloom in a certain way when you can do this, but on the modern grand piano it's a time-consuming task usually best left to a professional tuner/technician. (Because of the relatively unstable nature of their instruments, harpsichord players, like harpists and guitarists, have to learn to do this themselves early on in their studies, and thereby gain exposure to various temperaments and reference pitches, e.g. Werckmeister III at A=415.)

On the matter of price, 22,000 Euro (~$25K) is about the going rate for a new custom-built harpsichord, so that's not totally unreasonable. Other classical and orchestral instruments of professional quality frequently command similar sums, and new grand pianos easily get up into the six-figure range.

I would love to see one of these in person. Anyone spotted one in the US yet?


Bach owned two lute-harpsichords (lautenwerken) at the time of his death. It is a rare instrument but there are recordings that are worth hearing.

https://www.youtube.com/watch?v=-EmXzQsiMu8

https://en.wikipedia.org/wiki/Lautenwerck


This writeup has a lot of good starting points: https://privsec.dev/posts/android/android-tips/

I want to share my own reactions to the name change since this is a really interesting topic. For context, I'm an African American, so many of my ancestors were slaves.

  - The first time it occurred to me that "master" in this context could offend anyone was when GitHub changed the name (and broke my workflow).
  - My immediate reaction was, "this change is by white people for white people," where "white" means anyone who isn't black.
  - My next reaction was, "they may be changing the name for the wrong reasons, but the change is brilliant."
Let me explain a little more. Whether motivated purely by virtue signaling or by more genuine intentions, changing the name doesn't fix any of the problems that black people face. The article explains this well.

What's powerful about this name change is that it pushes us to alter a habit, in my case one embedded deeply in my fingers, something that I do every day without realizing that I'm doing it. Thus it is a useful reminder of the implicit bias that contributes to the lack of diversity in tech. Never mind that the old name was harmless; the change brings repeated awareness to an important topic, and it reaches the developer community in a targeted way.

So, next time you are annoyed that you have to fix a script, or you accidentally type master when you needed to type main, please just take a deep breath, change the name, and remember to reflect upon whether you have subconscious habits or biases that work against diversity in tech.


Interesting point, but a better post about compilers exploiting this type of UB is ryg's:

https://gist.github.com/rygorous/e0f055bfb74e3d5f0af20690759...


Future systems will be like the human genome: a single-letter change will lead to the system self-destructing after 20 days of flawless operation.

Note to self: Starting immediately, all raganwald projects will have a “Is it any good?” section in the readme, and the answer shall be “yes.”

Not sure I would call it fun. Combine this with a Rubik’s cube solver robot, AI, smart contracts, AR/VR, Neuralink, and this could be daily life for everyone in the year 2030.

Rubik’s cube solver robot: https://m.youtube.com/watch?v=8ZBP0n6yeQo


That's why I set up my Linux install to work like a live CD, with a two-layer filesystem: a read-only base, and a read-write overlay that lives in RAM. The files that I know I want to keep are bind-mounted from a read-write partition on the disk into the RAM filesystem, and all the rest gets deleted every time I shut down my PC.

A lot of software non-maliciously keeps records of everything you do with it, through logs or caches that aren't straightforward to delete, and this is the only way I've found to have control over that.


This article may be unnecessarily alarmist: https://twitter.com/BallouxFrancois/status/13240857614493040...

> (that may be a bit too cynical)

"You actually don't need to be open-minded about Oracle, you are wasting the openness of your mind [...] As you know people, as you learn about things, you realize that these generalizations we have are, virtually to a generalization, false. Well, except for this one, as it turns out. What you think of Oracle, is even truer than you think it is. There has been no entity in human history with less complexity or nuance to it than Oracle."

- Bryan Cantrill, https://youtu.be/-zRN7XLCRhc?t=2046



Maybe it is using the same effect as “Hard-to-Read Fonts Promote Better Recall”

https://hbr.org/2012/03/hard-to-read-fonts-promote-better-re...


I would posit that:

    impact = charisma * funding
so a 0.1 charisma score for some faceless tech exec * $1B in VC funding goes a lot further than a 2.0 charisma score * $50k of grassroots funding.

Privacy is usually not about things you don't want anyone to know. It's about things you don't want everyone to know.
