
Went to an IRC chat room when I was learning C in school. Asked if you could return a pointer to something that lives on the stack. Was talked down to by an all-knowing dude telling me to go read K&R again. Proceeded to write a code sample [1] that showed it is possible (it's not really stable, but it works reliably in recursive calls IIRC).

I do not like this attitude (then again it was just one random dude).

[1] https://www.onlinegdb.com/HyO5VXRxS


> it is possible (it’s not really stable but works reliably in recursive calls IIRC). [...] I do not like this attitude

You might want to listen. You’re getting the K&R comment and the downvotes because this does not work, ever. It’s a really, really bad idea. In recursive calls, it might not crash right away, but you will have bad data: the memory at the pointer address will have been overwritten by the next stack frame that’s placed there.

Don’t ever return pointers to local memory because the memory is “gone” and unsafe to use the moment your function returns. Even if you try it and think it works, it can and probably will crash or run incorrectly in any other scenario - different person, different computer, different compiler, different day...

Your comments about getting a warning and ‘However if you wrap the local’s address... it “works”’ should be clues. The warning is the compiler telling you not to do it. The workaround doesn’t work; it only compiles. By using aliasing, you’re only tricking the compiler into not warning you, but the warning is there for a reason.
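
To make that concrete, here’s a minimal sketch (my own toy example, not your linked sample; `dangling` and `clobber` are names I just made up) of why the aliasing workaround only hides the warning:

    #include <stdio.h>

    /* Returns the address of a local. GCC/Clang warn about the direct
       form ("return &local;"); routing it through a second pointer only
       hides the warning, it doesn't make the memory valid. */
    static int *dangling(void) {
        int local = 42;
        int *alias = &local;
        return alias;          /* invalid the instant this frame is popped */
    }

    static void clobber(void) {
        int other = 1234;      /* this frame likely reuses the same stack region */
        (void)other;
    }

    int main(void) {
        int *p = dangling();
        clobber();
        printf("%d\n", *p);    /* undefined behavior: 1234, garbage, or a crash */
        return 0;
    }

Whatever this happens to print on your machine today, the read through `p` is undefined behavior, so the output doesn’t prove anything.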


Listening to what? To the dude who tells me it's not possible and proceeds to dump a big pile of authority on top of my head, or to my own experiment that tells me another story?

I would have preferred to be told:

- Yes and no. You'll get warnings if you try to return a pointer to a local; however, doing this and that, you can manage to do it.

- But once you have achieved that, the result will depend on the way the stack is handled (not really in your control). You'll feel some comfort doing this in recursive calls; however, beware of signal.h.

But this isn't the answer I received. I guess C programmers do not know the difference between what you can do (however risky) and what you shouldn't do. Also, when someone asks such "weird" questions, do not assume he's a beginner with no notion of what constructs he can handle safely; maybe he's someone trying to find the limits of C, and once these limits are identified it can be a good conversation starter about C's internals and the way various compilers differ.

Edit: also, downvotes on HN are not like downvotes on Reddit: there's actually a limit (-2?), below which the comment disappears. Conclusion: only downvote when the comment engages in antisocial behavior (not respecting the rules or common human decency, etc.), not when you disagree with it. I always upvote an unfairly downvoted comment for these reasons.


I was trying to help by explaining it, instead of saying go read K&R, but I don’t get the feeling you really heard or understood me. There is no other story. There is no yes and no. There is only no. You cannot manage to do it. It does not work to return local memory from a function, ever, period. Once you return, it is 100% unsafe to try to use the memory from your previous stack. There is absolutely zero comfort in recursive calls.

You are mistaking the luck of having it not crash once for evidence that it’s okay in some situations. It’s not okay under any circumstances. That’s what makes this even more dangerous. Your program could crash at any time. It might run a thousand times and then suddenly start crashing. It might always run for you, and then crash on other people. But just because it runs once without crashing doesn’t mean it’s working.

A signal is not the only way your function’s stack can get stomped on the very next instruction after you return. Other processes and other threads can do it; the memory system can relocate your program or another one into your previous memory space. Recursive calls are guaranteed to stomp on previous stack frames: when your recursion depth decreases and then increases, the previous stack will be overwritten.

Returning a pointer to a local stack frame is always incorrect. It’s not risky, it’s wrong.
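
And if the recursive case is what feels safe, here is another rough sketch (again my own made-up example, not your code) of how deeper recursion reuses the frame you were pointing into:

    #include <stdio.h>

    /* Each call gets its own frame; the address returned here belongs
       to a frame that is gone as soon as the outermost call returns. */
    static int *countdown(int depth) {
        int local = depth;
        if (depth > 0)
            countdown(depth - 1);
        return &local;         /* compilers warn here, for good reason */
    }

    int main(void) {
        int *p = countdown(3);
        countdown(5);          /* deeper recursion reuses that stack region */
        printf("%d\n", *p);    /* undefined behavior: don't count on seeing 3 */
        return 0;
    }

Seeing a familiar number come back from that is exactly the kind of luck I’m talking about.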

BTW: you have the ability to see comments below the downvote limit: go to your profile settings and turn on showdead.

I didn’t downvote you, if that’s why you were trying to explain voting behavior to me, but you will find on HN that downvotes and upvotes both happen for a wide variety of reasons, and are not limited to either whether people agree, nor whether the comments are polite. Downvotes are often cast for comments that break site guidelines, for example just failing to assume good faith can get you downvoted. So can making blanket generalizations about a group of people, like the above “I guess C programmers do not know the difference...”. See the comments section here: https://news.ycombinator.com/newsguidelines.html

I sometimes upvote comments that appear to me to be unfairly downvoted. I usually upvote people who read and respond to me, regardless of whether I agree with them.


    - Java: no
    - Ruby: no
    - PHP:  no
    - C:    yes and no


?? I don’t understand what you mean. Those other languages don’t have pointers, they only have references, but what do they have to do with this?

Why do you still think there’s some yes in C? It’s not making sense yet that your memory is gone after you return? Returning a pointer to a local variable is exactly the same as calling delete or free on a pointer and then reading from it. You officially don’t own the memory after a return statement, so if you try to use it, then what happens is indeterminate. Again, since it doesn’t seem to be sinking in: it is always wrong to return a pointer to local memory. But, if you really really don’t want to listen, and you’re sure it works sometimes, then I say go for it!


Yes.

Signal handlers allow C programs to respond to events outside of the normal control flow (see signal.h, etc.). This means that once a function, say fnc1, has returned, the memory on the stack that was used by fnc1 can end up being reused at any point in time. A signal, perhaps generated completely asynchronously to the program itself by a different process, causes a stack frame to be allocated (possibly on top of fnc1’s old stack frame) for use by the corresponding signal handler. This could happen at any time, even before fnc1’s caller gets a chance to use the pointer returned by fnc1.
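
A tiny illustrative sketch of that scenario (my own example, not anyone’s real code; raise() just forces a handler to run synchronously, whereas a real signal could arrive asynchronously at any moment, which is the point):

    #include <signal.h>
    #include <stdio.h>

    /* The handler gets its own stack frame, quite possibly sitting on
       top of fnc1's old frame, and its locals scribble over that memory. */
    static void on_signal(int sig) {
        volatile char scratch[256];
        for (size_t i = 0; i < sizeof scratch; i++)
            scratch[i] = (char)sig;
    }

    static int *fnc1(void) {
        int local = 42;
        int *p = &local;       /* alias to dodge the warning; still wrong */
        return p;
    }

    int main(void) {
        signal(SIGINT, on_signal);
        int *p = fnc1();
        raise(SIGINT);         /* handler runs before the caller reads *p */
        printf("%d\n", *p);    /* undefined behavior: 42 is not guaranteed */
        return 0;
    }

The handler’s locals may or may not land exactly on fnc1’s old frame, but the caller has no way to know, which is why the returned pointer can never be trusted.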


Thanks, that was interesting.


Fighting stupidity, laziness and a misplaced love for typography in software developers


I have no problem working twice as much as normal. What I want is to be paid twice as much (at least).


Good for you. I’ve tried working twice as much once for ten days, and I am never doing that again. Not even for a one-off with one hundred thousand dollars/Euros/whatever after-tax overtime pay.


So are you fine with the idea that I get twice as much money and decision power as you, all things other than hours worked being equal?


This is a disingenuous argument.

Working twice the hours does not mean you are putting in twice the effort or getting twice the results. You're more likely to be a complete liability, because you're tired, unfocussed and not capable of thinking straight. Doing this for a week or more is a recipe for disaster. As for this leading to "more decision power" than people who chose not to indulge in such antics, I've yet to see this in any role I've been in in the last 20+ years. Most places would not see such behaviour as a positive, either for you or your place in the team and organisation.

The human body is not a machine. We can't work long hours without significant mental and physical impairment. I've worked 12-hour day and night shifts, and seen the effects on me and the others around me. Extreme tiredness and disrupted body clocks lead to careless, stupid and costly mistakes. We were never meant to work such gruelling hours.

I've worked 36 hours straight in a research laboratory. I was destroyed for several days afterwards, wiping out any benefit of the effort I'd put in. And I made a silly mistake 32 hours in, ruining the whole experiment, due to one moment of inattention after 12 hours of straight intense focus.

Individual "superhuman effort" does not beat effective planning, efficient use of time and teamwork, which is why so few companies want to rely on prima donnas, and instead want reliable and predictable people who won't flake out.

Today, I work regular office hours. Work hard while I'm there, and start and leave bang on time each day. I'm much better for it in all respects. And if you don't want to get severe RSI and burnout, I'd suggest at least considering these points. I didn't until I was severely affected by both, and it was not at all fun.


Money stopped being a major concern after my first year in my first job. Quality of life is more important to me. Your mileage may vary.


I’m surprised at the number of comments saying they don’t work 60/80+ hours a week.

I thought this was just the industry norm now and was going to factor it into my salary when I start looking for a new job.

Honestly, isn’t it safe to assume the company is underreporting the workload and instead calculate salary at an average of 60 hours/week?


Depends which specific part of the industry you work in, perhaps.

The company I work for has set working hours, with some flexibility in start and finish times. But it's basically 8 hours per day, 40 per week.

What surprises me most is that companies aren't limiting work hours to 40. And that people put up with working stupid hours. 80 hours would be utterly insane. Even 60 is ridiculous. The evidence shows that people don't get more done if the hours are increased further. But people do need a life outside work, and any more than that and you have zero time to yourself or your family, and that's going to be detrimental to your health and happiness. Why isn't there more pushback against it?


Officially I work 35h/w. A little less than 40k€. Senior level. Very efficient. I think my actual hourly pay is under the minimum wage. No overtime pay of course.


In Romance languages the at-sign is called an arobase/arroba.


We have macros and LISP is a building material, not a language.

One line of code is enough to get something that works: (defmacro infix [a op b] `(~op ~a ~b))

There are libraries for that too. Clojure: https://github.com/rm-hull/infix


Although I have never used Julia, this is something that is possible in Ruby, and I think this feature is seriously underrated – i.e. being able to jump quickly to the definition of external code, modify it and run it. When you work with multiple libraries/repos this is very valuable when debugging and saves a lot of time.

It should also be possible with Node.js I think (node_modules).


Don't get me wrong, I love Ruby, and my code was originally in Ruby, but the 20x speedup and better math libraries in Julia without sacrificing the speed of development too much made the switch for me very easy.

Actually translating Ruby code to Julia was much easier than I thought (the only real difference is the indexing).

The 1-based indexing really sucks in the PTX assembly output of Julia as well: I see a lot of useless increment and decrement operations where I don't expect them.


And he abandoned his schizophrenic son.


People from Nasca also built a wind-powered aqueduct system.

https://www.mnn.com/earth-matters/climate-weather/blogs/myst...


Some people get high on big frogs in Arizona. Or so said the urban legend. And then a few weeks ago YouTube suggested a video to me (Joe Rogan maybe?) where the differences between ayahuasca DMT and frog DMT were discussed. In the comments someone said: "It happened to my chihuahua! I was walking my dog (I'm from Arizona) when that huge frog squirted his venom at her. The poor animal convulsed for 30 minutes."

Amazing.


You are referring to bufotoxin / bufotenin, which is produced by some toads (of the genus Bufo).

You can read more on Wikipedia:

https://en.wikipedia.org/wiki/Colorado_River_toad


That venom in its raw form is poisonous to everything. It's highly dangerous, but when smoked it's converted to 5-MeO-DMT, which isn't poisonous.


The absence of abstraction being what, exactly?

Also beware of that kind of pseudo-wise thinking. Is there a point in climbing up a learning curve anymore? Maybe you have a constant and massive stream of fresh bodies to throw at your "simple" solutions before laying them aside once those abstractions have clogged up their heads ... And maybe if you can hire unskilled workers at such a scale, it's because you have massive funding as well...

More a question of financial optimization than software engineering, I think.


Before dismissing it out of hand, take a look at the Go language. It was designed to make specific kinds of common abstractions hard exactly because, when working at scale, programmers routinely create disasters by layering abstractions in a way that nobody can understand the consequences of.


This is exactly the kind of pseudo-wisdom that I read the GP as referring to, though.

In the case of Go, the core team saw the pain of indirection-masquerading-as-abstraction in complex Java/C++ codebases and considered the whole thing to be a boondoggle. As a result, we’ve been saddled with a popular language in which two massive projects (gVisor and Kubernetes) have had to hack their own expressivity into the language just to build complex software (i.e. codegen’d generics).

I worry about the cyclic nature of progress in our industry, where wonderful advancements can be made and then walked back or under-utilized because we aren’t patient enough to learn them thoroughly.


Go is Java 1.0 all over again.

If its enterprise adoption ever goes beyond Kubernetes and Docker, expect GoEE and Go Design Patterns to make their appearance.

Worse, since its plugin support is really limited, expect any enterprise-grade CMS to be built on hundreds of processes.

This happens all the time with simple languages: tons of library boilerplate code.

