87/100. Very interesting game, it definitely helped me start to understand why some people get so upset about incorrect kerning. Would have been nice to see a running total score throughout.
Very interesting article, especially the t_scope allocator - I never knew you could get GCC to perform that cleanup automagically. One minor grammar point: isn't a lock under contention a contended lock, not a contented lock?
Since the implementation isn't shown, I'd like to know whether these allocators are themselves based on malloc or if you have some tricky assembly/kernel code going on somewhere.
Yes, I wish that cleanup was a portable C feature! Glad to see that GNU is trying something here, as it would serve as a prototype for standardization. Perhaps in a future version of C...
I haven't written much C++ in quite a few years - mostly do Ruby these days, so I'm sure I'm making some terribly embarrassing faux pas or other with the example below. But you don't need more than the C++ functionality to compose your own variations if you want more flexibility.
Better would be to just have the Scope class take a single function to run and establish multiple stack entries for them. This would be my version:
#include <cstdio>

template <typename tFunc>
class ScopeFunc {
private:
    tFunc mFunc;
public:
    ScopeFunc(tFunc func) : mFunc(func) {}
    ~ScopeFunc() { mFunc(); }
};

template <typename tFunc>
ScopeFunc<tFunc> on_return(tFunc func)
{
    return ScopeFunc<tFunc>(func);
}

int main() {
    auto first  = on_return([]() { std::puts("first"); });
    auto second = on_return([]() { std::puts("second"); });
    std::puts("Hello");
}
Note that this is C++11, using lambdas. Also note that the output will be reversed from your expectation, since destruction order is LIFO, but that's probably what you actually want for real deferred behaviour rather than text output.
It also optimizes nicely, which yours may not, because the loop unrolling may be complicated and there's a higher chance of aliasing of the function pointers in the vector.
Good idea, this definitely accomplishes the same functionality, but it's done at runtime, rather than the compiler knowing all the functions at compile time. Perhaps a minor difference for most cases, though...
I would still prefer a C extension... Perhaps I should just use Go these days, though ironically all these memory allocation policies are useless in a GC language.
You underestimate the compiler. This formulation might be tricky, but the version I just posted (adjacent to this post) definitely leaves the compiler knowing exactly what's going to be called when the function exits.
No. The C vs. C++ question is simple language vs. comprehensive language. A simple language might be preferred for various reasons. For example, code is easier to review, which is Linus Torvalds's argument. Bindings to other languages are easier to create. The compiler is simpler, which might be important for embedded and high-reliability jobs.
I find it incredibly annoying that laptop lights blink when they're on standby - I often put my laptop on standby when I sleep and then I have a light either blinking or "waxing/waning" in my face (I like to sleep in complete darkness). My improvised solution is to throw an item of clothing over the laptop now.
It was abuse, using a misogynistic word, in a department where there is already a dearth of women due to the hostility they face. Brushing it under the carpet by denying her experience isn't helpful. You don't need to read this story to learn of sexism in CS departments, just ask any woman who does computer science and you can hear plenty of stories.
Right, but it's an anecdote. The title of the piece suggests some sort of comprehensive study, but fails to deliver anything beyond a single story of friction between two people (one of whom clearly has women issues).
It's more about what was communicated/interpreted than the actual words.
Take for instance: "I'm not going to let some bitch tell me what to do" vs. "Hey bitches, what's up?"
The first statement could communicate that the speaker has issues with females telling him what to do. The second statement is playful and affectionate.
However, because the word bitch is so generic and widely used, the first speaker could easily just have issues with unpleasant people, not females. It's not clear.
That's why it is better to talk about a person's pattern of behavior, rather than labeling an isolated incident as sexism.
where there is already a dearth of women due to the hostility they face
Where the hell did you pull that from? I suppose there's a "dearth" of men in nursing too? Can I infer an epidemic of misandry because we don't have 50% male nurses? Ludicrous.
just ask any woman
Yeah, because the plural of anecdote is data </sarcasm>
Can you really use this as an argument against the government protecting workers by ensuring they get a guaranteed amount of holiday a year? I don't think it's valid to say EU workers have their pay downsized to compensate for enforced holidays; what about things like minimum wage? (After all, we are talking about people on the bottom end of the scale here, who have minimal protection from employers already.)
"But the site acknowledges that they are largely not to blame. It reads: "While giving millennials grief is highly entertaining, we want to acknowledge that the woeful state of the economy is not their fault. These free issues and e-cards are intended to help a generation that could sure use a hand, not to blame them.""
The quote "These free issues and e-cards are intended to help a generation that could sure use a hand, not to blame them." does not seem very genuine to me. It is very difficult for people in my generation to advance themselves in this climate. Youth unemployment has risen globally, and personally I'm horrified at how this government (in the UK) is sabotaging the future of so many children and young people. These kinds of glib pot shots were funny 10 years ago, when social mobility was possible, but they come across as arrogant and out of touch in today's world.
Seems to be an ever-increasing trend to replace Python components experiencing high load with ones written in Go. Is this Go's niche? It's definitely a language I will be learning in the near future; it doesn't seem like it will be going away any time soon.
I've settled on Go to replace a worn out Java mess (otherwise a Python shop). We need the computational performance, and I do like the general feel of the language. I think this is something you're going to see a lot of going forward. It's the same niche Scala has been filling to an extent, but I personally think Go is a much better option (unless you need the JVM of course).
You might have misunderstood the parent, I think. It seems like he was saying that they used Java (and then Go) in lieu of Python because of the performance.
But to answer your question, the Language Shootout seems to suggest Java and Go are on the same plane in terms of speed. Take that with however many grains of salt you like.
It's also interesting that according to the Language Shootout Go uses a fraction the memory of Java - meaning you can save a lot of money by deploying it on cheaper machines or VMs and get similar performance to Java.
I just checked out the language shootout. Go has really pulled ahead from where it used to be. Faster than SBCL or OCaml or Free Pascal on quad-core 64-bit - that's impressive.
I can use Scala and not pay that penalty. I'm looking at Go to replace components of my system where I don't want a full bore JVM, but I have to be thoughtful about latency. I prefer C for this (C++ seems to be the standard there). Would love to move to something like Go once I can.
I'd say Go's niche is pretty much any sort of network server. Right combination of high performance, easy multithreading, and making it hard to shoot yourself in the foot (no buffer overflows, for instance)
I've heard this several times. But the success stories are mostly concerned with smaller parts under heavy load. Go still is a lot harder to work with than Python if you know Python and its tools well...
With Go you don't have anything comparable to Django, Numpy, Pandas...
> With Go you don't have anything comparable to Django, Numpy, Pandas...
Given the age of the language I would assume building out its associated toolchain is only a matter of time. Python didn't ship with Django, Numpy, Pandas...
Python is rapidly becoming the lingua franca of data science; but there the vast majority of your inner loop isn't Python (numpy, scipy, pandas, numba, theano; C, Fortran, assembler, Cython, LLVM, CUDA...).
Basically Python is the glue language data people wanted, it turns out.
As are most of us, lol. I would say if you want to use golang for a serious project and need additional libs, you have to be ready to write them yourself, which grants the opportunity to give back to the community and contribute to open source, etc.
Will this change with time? I know numpy, pandas and matplotlib are likely relatively large projects, but my assumption is someone will likely put together data analysis / matrix math libraries to perform some of these functions.
I think Julia (http://julialang.org/) might be a better alternative to Scientific Python than Go. I'm not sure you can get the same flexibility/expressiveness you get in Numpy/R/Matlab in Go.
I find it rather annoying that everyone seems to be advocating Go, it may be a good language, but I can't help but feel that the only reason it's being used is because Google's behind it. You guys should look into alternatives like Nimrod (http://nimrod-code.org/), which is in fact a lot closer to Python than Go will ever be.
People are advocating Go because it delivers. It has great concurrency primitives (channels), it is "boring" language syntax wise, and it is fast.
Beyond that, it deals with a lot of the 'ugly' things around the edges of other languages. Dependency management, build management, deployments... all these IMHO are much more well thought out in go.
It delivers? Lol. The GC used to suck for 32-bit systems and it still sucks for realtime. As opposed to Nimrod's, which pretty much guarantees a maximum pause time of 2 milliseconds, independent of the heap size. And that's only the start: Go also lacks generics and metaprogramming in general. And its memory/concurrency model is just as racy as Java's.
Go was never designed for "realtime". Also, 32 bits wasn't the main compiler focus; 64 bits was. With that problem mainly fixed in the 1.1 release, this is a non-issue now. The memory model seems pretty well defined without being too restrictive, and with the recent addition of the race detector, Go looks well equipped for these kinds of problems; some pretty interesting projects are there to prove it.
You are correct on some points. 32 bit was broken, realtime is a non-feature. It does (by design) lack generics and meta programming (Pike talked about these at length at one point).
I have to disagree on the concurrency model, I think message passing channels are a much more natural primitive to model concurrency in, and goroutines are exceptionally light.
EDIT: When you talk about nimrod, you might go ahead and mention you are the designer of nimrod... it might color your judgement.
"Nobody is smart enough to optimize for everything, nor to anticipate all the uses to which their software might be put." -- Eric Raymond, _The Art of Unix Programming_
Go's primary niche is server software, and in that niche, it is gaining in popularity and has the backing of a large company. For servers, neither support for a 32-bit address space nor real-time support is important.
Does support for generics really matter when the language has built-in support for the most common collection types?
The allowance of shared mutable memory between goroutines does worry me somewhat.
Personally, having it backed by Google makes Go better in my book. I feel a ton of insanely smart individuals are working on it and it can only get better.
Nimrod? Never heard of it or the person backing it.
I'm _not_ a language snob, trust me! I'm just a regular family-guy software developer, and I try to put my proverbial eggs in reliable baskets.
Go doesn't seem to be going away any time soon and it's really really fast.
If you want to put your eggs in reliable baskets then why not use Java, or C#, or even C++. Those languages are most definitely not going to go anywhere anytime soon.
I now have to yet again close my apps and restart my computer for an IT-forced Java runtime update, for some app I rarely use. For that reason alone I'd not consider Java.
I forgot that C# added more asynchronous stuff since I last coded in it. I used to love C#, but it's more verbose than Go and I'm wary of the obfuscated MS documentation that pushes me straight to blogs. Also, I gave up on using Windows for web development.
If I'm not mistaken, that's a fallacious slippery-slope argument. I'm still undecided on how to weigh language popularity against other factors, but surely relative popularity is an important factor in things that have a big effect on real-world software development, such as availability of libraries and tools. For now, at least, Go is much more popular than Nimrod, and also has more people working on the language implementation and surrounding tools.
I believe Go's purpose is exactly that of replacing more familiar languages with a statically compiled version that is palatable. It's fulfilling a niche where, otherwise, one would resort to C/C++.
If you're CPU bound, I think it's a good option to at least check out. It can work with real threads and doesn't contend with the GIL. In our case, our actual CPU usage was low, but due to the high concurrency, any little bit of CPU work required a context switch and contended with everything else. We alleviated that by just using a bunch of processes.
Will it replace all of our Python? Highly doubt it. Some very specific pieces? Probably.