
Two employees of a company were suddenly approached by the CEO accompanied by the CTO.

"Death or Scrum?" asked the CEO.

The employees knew nothing about this Scrum thing, but were too intimidated to ask, and the other choice was one they were not ready to make.

The first one thus replied in a quavering voice: "Scrum"

At that moment the first employee was grabbed by the CTO and put through horrors not many could withstand:

- Neverending sprint planning meetings

- Daily Scrums that lasted 4 hours

- The sprint reviews

- Sprint retrospectives

- The backlog refinements

After all this the employee was only able to say: "Still working on User Story XYZ. No impediments."

The CEO then asked the second employee what their answer was. The employee was fighting a tough fight in their head.

"I hate meetings, I would love to do some real work, but I'm not ready to die yet. On the other hand, I can't go through life after all that abuse; There's no way I could live with myself". So he answered: DEATH!

To this the CEO swiftly replied: "Death ... by Scrum."


Scorch is the ultimate gaming experience. In primary school we would always finish computer science classes with a game (or a few) of Scorch. Recently I've shown this game to my kids (10 and 8 years old) and they were really excited to play it. They don't make games like they used to anymore.


Looks like a good place to work out before attempting to switch off notifications in some popular applications.


Certain cookie opt-out dialogs are a good training ground as well.


So we have come full circle. That's the way we were building software when I started in professional software development (around 2005). But the idea of doing a thorough analysis (business, technical) is a little bit older. The only people to whom this might come as a surprise are the ones who've drunk too much of the agile Kool-Aid. I'm not here to bash agile and start this flame war all over again (agile has its merits ...), but somehow thinking before doing got a bad rap recently (for reasons beyond me).

It's like software development got trapped inside King Julien's (from the Madagascar movies/series) brain, with his modus operandi: "let's start doing this before we figure out it doesn't make any sense".


The phrase I hang on to is something thrown out by someone I used to work with: "Agile isn't an excuse not to do things". Too many organisations use "we're doing Agile" as an excuse not to do any design thinking and instead "just code".

What Agile actually is (in the most general terms) is a different order of doing things that increases the amount of knowledge you have when you do it, and reduces the chances of wasting your work. In my example, you still do the design work, but rather than doing it for the whole system at once you do it for the piece you're about to do, and then implement that piece[1].

But yes, it's nice that the internet has discovered Waterfall!

[1] Although many systems, even in an agile world, do require a degree of up front "whole system" architecture and/or design thinking.


Because thinking clearly is the hard part. I use GitHub Copilot every day for work (and it generates decent code completions for me), and the main thing I've realised is that the major work in programming is to think clearly; converting thoughts to syntax is not really that hard. Copilot takes away that easier cognitive workload and helps me focus and clear my thoughts. Also, does anyone know how to think clearly, or does that just come with practice and experience?


Thinking, especially abstract thinking, is crucial to software development, as often (not always) creating software means abstracting real life into algorithms and data structures. Writing text in a word processor, notepad, etc. is, in my opinion, the first step to validating the idea, to having at least a faint idea of the complexity of the solution we would like to build.

On the question of "how to think clearly", I'd say it's pretty individual: some people start "from the bottom", some "from the top" and others "in the middle". Experience is knowing which is which and which approach suits you best.


Wise words. That's prolly why you need to be good at math to be good at tech, abstract thinking all the way


I think you can sum it up this way: be planning-heavy where it makes sense, and agile where it makes sense. There is no one true way. I'd not want to use agile to develop the formula language in a spreadsheet, and I'd not want to waterfall a marketing website for a speculative product. Common sense applies.


Agile and Documentation-Driven Development don't just easily go hand in hand; I think agile is perfectly suited to working like this.

We figure out what the software is supposed to do -> We write documentation describing that -> We build the implementation -> New Requirements -> Figure out how to incorporate them into the plan -> Update the documentation -> Build the changes.

It's an iterative process, perfectly suited to agile development.


I would assume that an agile process will rely on Markdown, but then at some point does it get (irreversibly) committed to a more conventional document format?


I assume (please correct me if I'm wrong) that by "conventional document format" you mean ones that do not play well with version control software like git (e.g. because they may be/contain huge binary blobs), yes?

If so, at least in the projects I am involved in, all documentation that is checked into the repos remains in Markdown (AsciiDoc is also used a lot), and is only converted to non-plaintext formats for the purposes of release.

This is similar to how we build executables from our source code, but don't check the resulting binary files into the repo.
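
To make that concrete, here is a rough sketch of what such a release step can look like (the tool choice, pandoc, and the directory names are just illustrative assumptions, not specifics of our projects):

    import pathlib
    import subprocess

    # Illustrative release step: render every Markdown file under docs/ into .docx
    # files in dist/, so only plaintext sources are kept in version control.
    # Assumes pandoc is installed and on PATH.
    docs_dir = pathlib.Path("docs")
    dist_dir = pathlib.Path("dist")
    dist_dir.mkdir(exist_ok=True)

    for md_file in docs_dir.glob("**/*.md"):
        output = dist_dir / md_file.with_suffix(".docx").name
        subprocess.run(["pandoc", str(md_file), "-o", str(output)], check=True)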


Yes, by "conventional document format" I meant something in the Microsoft crapiverse.

Keeping it in markdown/asciidoc sounds great if you can make it work. Is your work in technical environments, where people have no expectation of using MS Word, and where users who need fancy features like footnotes can learn them, retain that knowledge, and use them without inadvertently damaging the document?

I ask because I worked as a TC in a software firm targeting Microsoft platforms, so they relied on Word (and SharePoint), were resistant to ideas of XML and structured documentation, and Markdown never even raised its head.


>somehow thinking before doing got a bad rap recently (for reasons beyond me).

I've seen plenty of software devs engage in detailed forward planning for a future that would be utterly different from what they expected.

Best case, this rendered all that forward planning moot and a waste of time. Medium case, they did a lot of pointless work. Worst case, they technologically straitjacketed themselves and dug multiple holes they couldn't extricate themselves from easily.

Then they'd pick themselves up and do it all over again, thinking that if they (or more usually somebody else) just managed to predict the future better, then this kind of shit wouldn't happen. E.g. those idiot PMs just need to provide better requirements.

It's a hard rut to get out of because in so many other spheres of life upfront planning is critical and the future IS predictable. Software intuitively feels like it should be too. But it's the exact fucking opposite of that.

This is, at least, why certain kinds of thinking before doing got a bad rap with me.


Sounds like a problem with the execution instead of the technique.

The alternative to thinking before doing is wandering aimlessly, which really isn't likely to steer you where you want to be.

We basically developed whole-ass treatises (and some unhealthy cargo cults) around people saying that the observe -> think -> do -> restart loop should be small and leave space for adjustments between each step.


> The alternative to thinking before doing is wandering aimlessly, which really isn't likely to steer you where you want to be.

When I was a kid, we topped over this hill on the interstate, and way down in the distance there was an overpass across the road, with straight road from here to there. And I wondered how my dad could aim the car so well that it would go under that overpass way off in the distance.

Of course, now I know that he didn't do that. He didn't even try to do that. Instead, he steered the car.

The alternative to thinking before doing is not wandering aimlessly. It's steering. It's knowing where you're trying to go, even if you don't exactly know how to get there, and having an initial idea of how to get there, and then starting to go there, and adjusting as you find obstacles that you didn't know existed, and as you find that your aim was off.


I recommend revisiting Mr. Stoll's book 'Silicon Snake Oil' (1995). It shows how much people who are inside a revolution can fail to see that it is the revolution. I think he got nearly everything about what the internet would become wrong.

Mr. Stoll wrote something along the lines of: "Some people who are offline feel that they are cut off from some very important aspect of the present day. However, everyday life requires computers or access to digital networks only in some respects. They are irrelevant to cooking, driving, receiving guests, talking, eating, walking, dancing and gossiping. There is no need for a computer to bake bread, play football, sew a bedspread, build a fence, recite a poem or say a prayer."


"grab her waist, pull her close, lift her up, ..."


I'm on the fence with this one.

In the EU we had a discussion about a link tax and some related areas regarding remuneration for content providers by content-distribution platforms (search engines, prominently Google, and social media sites like FB, Twitter, etc.). Back then I was strongly against these new laws, as I considered some of the newly proposed regulations bundled with the link tax potentially harmful for e.g. open-source software managed in GitHub repos.

When I think of the link tax now and after reading this rant defending FB, I lean towards the link tax.

Quotes like this one make me shiver: "First is the link tax. This is fundamentally against the principles of an open internet. The government saying that you can't link to a news site unless you pay a tax should be seen as inherently problematic for a long list of reasons."

I don't think we should put FB and the open internet in the same sentence. Sharing the news is available all the time, through e-mail, Slack, SMS or any other means of electronic communication. It has just stopped being available through FB (a private, global corporation), which will STOP benefiting from it.


What service in their right mind would link to any content that requires payment? Do media organisations think their content is so great that people will pay just to link to it? Links drive SEO; charging for them is the equivalent of shooting yourself in the foot.

This can only work if governments force companies to display selected media links and then charge them for it. It can only work if governments force search engines to alter their algorithms to preference media organisations and charge the search engine in the process.

It is the worst piece of tech-related legislation conceived in a so-called democratic country; the fact that it's even progressed to this stage is mind-boggling.


I think the issue is that in many cases it does not bring traffic to the news outlet: people read the title/summary shown by Google/Facebook and don't click on the link.

News outlets therefore assume that it's unfair, since they don't get the corresponding advertising revenue, and Facebook does instead.


Isn't this the same problem of people glancing at the front page of a paper and not buying it? Yet apparently newspaper companies put that clear plastic in there so that the front page could be read, the assumption being that seeing the front page headline is more likely to make you buy the paper than not seeing it.

So too, the point of "clickbait" is that they are doing everything in their power to make you click. So the assumption that there is a much larger audience that would click if only they didn't have access to the link from Facebook is, well, difficult to believe.


The issue with this law is that the only opt-out is the one Facebook has taken. If they allow links to any news provider, then they must allow links to all news providers, and pay for them. And if they cannot reach a fair agreement, arbitration is compulsory: ignore the value Facebook gives to News Corp, consider the value News Corp gives to Facebook.

It's possible to come up with decent regulation/law that does what this one is nominally advertised as doing. But this one is unreasonable.

(Also, the way we got to this point is worth considering. Murdoch spent a decade kicking out prime ministers and destroying democratic oversight. Once he'd finally abolished any expectation that democracy serves the people, he wrote a law and gave it to the parliament to move wealth from the successful advertising companies to his own less successful advertising companies.)


HN links to articles in 99% of its posts. If HN had to pay a link tax, would it survive?

I hate FB too, and avoid their products, but Australia's new law is Mafia-style extortion, and we need to set aside our feelings about FB when judging this law.


The Japanese did their fair share of disgusting things during WWII and the occupation of China and Korea, but the article sounds like an old joke about a mugger who says he saved a girl today: he didn't mug her, so that counts as a save.


we need to go deeper ...


so it's time for Carbon-14 then ...


Carbon-14 dating works because once an organism dies, it stops exchanging carbon with its environment, so C14 stops coming in. Carbon dating any living organism will come back with "it died 0 years ago/hasn't died yet", give or take whatever details make it more complicated in reality than in theory, as it always is.


As I recall, there's also an issue with radio-carbon dating anything after atmospheric atomic weapons testing commenced.


That problem started earlier, when we started burning significant quantities of coal. https://en.wikipedia.org/wiki/Radiocarbon_dating:

”Research has been ongoing since the 1960s to determine what the proportion of 14C in the atmosphere has been over the past fifty thousand years. The resulting data, in the form of a calibration curve, is now used to convert a given measurement of radiocarbon in a sample into an estimate of the sample's calendar age

[…]

Because the time it takes to convert biological materials to fossil fuels is substantially longer than the time it takes for its 14C to decay below detectable levels, fossil fuels contain almost no 14C, and as a result there was a noticeable drop in the proportion of 14C in the atmosphere beginning in the late 19th century. Conversely, nuclear testing increased the amount of 14C in the atmosphere, which attained a maximum in about 1965 of almost twice what it had been before the testing began.”


Is there any dead carbon, like perhaps in the teeth? Have never heard of it being done, but would be neat.


Apparently it is possible to do radiocarbon dating on tooth enamel. Enamel contains 0.4% carbon and is indeed frozen at the time of development. Unfortunately, we're limited to the post-nuclear-testing era, because without the atmospheric spike from those tests, C14 has very low precision due to its half-life of 5,730 years.

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2957015/

https://www.nature.com/articles/437333a


Thanks! That's interesting, to use the post-nuclear curve as a modern clock, instead of using C14's half life for an ancient clock. For a given tooth they seem to assume all enamel was from one date, during childhood.


Teeth mostly consist of hydroxyapatite, which doesn't contain any carbon.

Not 100% sure, but for someone that just died, you might be able to do it with their brain, as brain cells are hardly ever renewed. For a living human I don't think there is any way that's non-invasive.


That thing about brain cells not regenerating is a myth, btw.


Could you provide a source for that?

I recently tried to read up on the topic and everything I found stated that brain/nervous cells are generally barely regenerated and can stay with you your whole life. Granted, most articles trace back to this source[0] from 1967, but if newer information exists, they've done a really bad job at spreading it.

[0]: https://bionumbers.hms.harvard.edu/bionumber.aspx?&id=101940


Nope. As long as you're alive, you're breathing and taking in C14 from the atmosphere (it's produced by cosmic rays turning N14 into C14). When you die you no longer breathe and the C14 decays. So radiocarbon dating measures the time from the death of the organism to the present. edit: ninja'd.


Maybe there's a part of the body that stops exchanging carbon with the lungs after a certain point? Like maybe the teeth or the corneas.


Assuming this worked, it seems a bit extreme.

You want to validate the age of a person, so you remove one of their teeth or take a tissue sample from their eye?


"The Orville" had an episode where an alien culture used a procedure that sampled teeth to determine the exact birthdate of a person.

There's not much in the tooth apatite that would be useful for radiometric dating on the time scale of a single human lifespan, though. I don't think you could do it with radiometrics, no matter what you sampled. Anything precise enough to indicate one year within a range of 100 would probably kill the animal with its radiation, even if it were naturally common enough to be taken up by living organisms.

It may be that a tooth is an easy source of original DNA, and approximate age may be extrapolated from accumulation of copy-error mutations in different types of cells in the body, which would be affected by the frequency of division. So by comparing DNA found in the tooth mineral against that found in the tooth nerve cells, and in the tooth blood supply, it might be possible to narrow the range of possible whole-organism ages by referencing mean mutation rates, and then further narrow it by checking telomere lengths. Seems like that might vary somewhat by individual, and their history of radiation exposure.


The half-life of carbon-14 is 5,730 years. So all it would be able to do is tell you the person is between 0 and 5,730 years old anyway, right?

That wouldn’t help solve these cases.


No, the radioactive decay follows a (nearly) continuous exponential curve. Half-life is just an arbitrarily chosen point on that curve: the point at which half of the atoms have decayed. With carbon dating you are basically solving this equation: fraction of carbon-14 left = 0.5 ^ (years since death / 5,730). If it has been 500 years since death, you would expect 94% of the original carbon-14 to remain.

That said, you generally have error bounds spanning multiple decades, and it measures the time since death (when new carbon stops being integrated into the body), not the time since birth, so it would not be useful for this.
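
A quick numeric sketch of that equation (the helper names below are mine, just for illustration):

    import math

    HALF_LIFE_YEARS = 5730  # half-life of carbon-14

    def fraction_remaining(years_since_death: float) -> float:
        """Fraction of the original C14 still present after the given time."""
        return 0.5 ** (years_since_death / HALF_LIFE_YEARS)

    def estimate_years_since_death(fraction: float) -> float:
        """Invert the decay curve: elapsed time from a measured C14 fraction."""
        return -HALF_LIFE_YEARS * math.log2(fraction)

    print(fraction_remaining(500))           # ~0.94, the 94% figure above
    print(estimate_years_since_death(0.94))  # ~511 years
    print(fraction_remaining(80))            # ~0.990
    print(fraction_remaining(110))           # ~0.987, far too close to 0.990 to
                                             # pin down an age within a human lifespan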


Note that you can use tooth enamel, which freezes its carbon at time of development (instead of death). https://news.ycombinator.com/item?id=20627335

That said, the error bars are simply too large to do anything reasonable for dates prior to 1955. The difference between 80 years and 110 years is 99.0% vs. 98.7% of the original C14 concentration... and while you may be able to get a very precise measurement of the remaining C14, you also need to know the baseline from that era very precisely to determine the percentage. Thus typical radiocarbon error bars are at least ±60 years.


Thanks for the details.


Oof, HN does not like jokes.

