I was part of an ACM programming team in college. We would review classes of problems based on the type of solution they required, and learn the techniques for solving them. We were permitted a notebook, and ours was full of the general outline of each of these classes and techniques, along with specific examples of the more common algorithms we might encounter.
As a concrete example, there is a class of problems that are well served by dynamic programming. So we would review specific examples like Dijkstra's algorithm for shortest paths, or the Wagner–Fischer algorithm for Levenshtein-style string editing. But we would also learn, often via these concrete examples, how to classify and structure a problem into a dynamic programming solution.
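To make that concrete, here's a minimal Python sketch of Wagner–Fischer, which is the canonical "structure the problem as a table of subproblems" example: dp[i][j] is the edit distance between the first i characters of a and the first j characters of b, and each cell is built from already-solved neighbors.

    # Wagner-Fischer: edit distance via dynamic programming.
    # dp[i][j] = minimum edits to turn a[:i] into b[:j].
    def edit_distance(a: str, b: str) -> int:
        m, n = len(a), len(b)
        dp = [[0] * (n + 1) for _ in range(m + 1)]
        for i in range(m + 1):
            dp[i][0] = i          # delete all of a[:i]
        for j in range(n + 1):
            dp[0][j] = j          # insert all of b[:j]
        for i in range(1, m + 1):
            for j in range(1, n + 1):
                cost = 0 if a[i - 1] == b[j - 1] else 1
                dp[i][j] = min(dp[i - 1][j] + 1,        # deletion
                               dp[i][j - 1] + 1,        # insertion
                               dp[i - 1][j - 1] + cost) # substitution
        return dp[m][n]

    print(edit_distance("kitten", "sitting"))  # 3

The contest skill was less about memorizing this particular table and more about recognizing when a new problem can be recast into that "fill in a grid of subproblems" shape.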
I have no idea if this is what is meant by "l33t code solutions", but I thought it would be a helpful response anyway. The bottom line is that these are not common in industry, because hard computer science is not necessary for typical business problems, the same way you don't need materials science advancements to build a typical house. Instead it flows the other way: advancements in materials science trickle down and change what the typical house build looks like.
I imagine there are companies in existence now whose entire value proposition is clean, human-generated data. At this point, the Internet as a data source is entirely and irrevocably polluted by large amounts of ducks and various other waterfowl from the Anseriformes order.
My guess would be a lack of actuators. For instance, this robot looks like it has an ankle that can only go up and down, but not roll like a human's. Also, I wonder if there's a center-of-gravity issue, as it almost always appears to be leaning backwards to compensate.
I think its recoveries are still pretty impressive, even though an unnaturally large number of them are necessary. About 8 seconds into the video on the homepage, it nearly misses the second step and ends up slipping off it. I've eaten shit missing a couple-inch curb, though I don't think "graceful" has ever been used as a descriptor for me. So the fact that it just recovers and keeps going without issue is impressive to me.
> So the fact that it just recovers and keeps going without issue is impressive to me.
I'm pretty sure that's just a matter of reaction speed, plus it maintaining a constant focus and vigilance on its movement, the kind you'd usually not muster outside of certain sports and situations pre-identified as dangerous enough to deserve the attention, like concentrating on balance and on not getting into a position that overstresses your joints when you know it's icy.
I'm not familiar with any of these communities. Is there also a general bias towards one side between "the most important thing gets the *most* resources" and "the most important thing gets *all* the resources"? Or, in other words, the most important thing is the only important thing?
IMO it's fine to pick a favorite and devote extra resources to it. But that turns less fine when one also starts working to deprive everything else of any oxygen because it's not your favorite. (And I'm aware that this criticism applies to lots of communities.)
It's not the case. Effective altruists give to dozens of different causes, such as malaria prevention, environmentalism, animal welfare, and (perhaps most controversially) extinction risk. EA can't tell you which root values to care about; it just asks you to consider whether a charity is impactful.
Even if an individual person chooses to direct all their donations to a single cause, there's no way to get everyone to donate to a single cause (nor is EA attempting to). Money gets spread around because people have different values.
It absolutely does take some money away from other causes, but only in the sense that all charities do: if you give a lot to one charity, you may have less money to give to others.
The general idea is that on the margin (in the economics sense), more resources should go to the most effective and neglected thing, and the amount of resources I control is approximately zero in a global sense, so I personally should direct all of my giving to the highest-impact thing.
And in their logic the highest impact is to donate money, take high-paying jobs regardless of morality, and not focus on any structural or root issues.
Yeah, the logic is basically "sure, there are lots of structural or root issues, but I'm not confident I can make a substantial positive impact on those with the resources I have, whereas I am confident that spending money to prevent people (mostly kids who would otherwise have survived to adulthood) from dying of malaria is a substantial positive impact at ~$5,000 per life saved". I find that argument compelling, though I know many don't. Those many are free to focus on structural or root issues, or to try to make the case that addressing those issues is not just good, but better than reducing the impact of malaria.
Hello, fellow ex-employee of that bank. I was in a segment governed by PCI, and they wouldn't even let us touch Gaia for fear of the whole thing being declared in scope.
That only works if the external API is handing off the entire subscription to Apple, up to and including payments. But the entire premise is to move away from being forced to use Apple for these elements, which makes it a nonsensical interpretation. In fact, that particular interpretation is the current status quo -- apps already use APIs to create subscriptions entirely managed by Apple.
If Apple does not control the actual subscription, but is only providing an interface for managing it, then Apple must alert the actual owner of the subscription upon changes. There's no guarantee that the code on the other side handles that alert properly.
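As a purely hypothetical sketch of that failure mode (the endpoint name and payload shape are invented for illustration and are not Apple's actual notification API), the developer's side would need something like a change-notification handler, and nothing forces that handler to exist or to behave correctly:

    # Hypothetical developer-side handler for a "subscription changed" notification.
    # All names here are made up for illustration.
    from flask import Flask, request, jsonify

    app = Flask(__name__)

    # Pretend in-memory store of subscriptions the developer actually owns and bills.
    subscriptions = {}

    @app.post("/subscription-change")
    def subscription_change():
        event = request.get_json()
        sub_id = event.get("subscription_id")
        action = event.get("action")  # e.g. "cancel"

        # Nothing guarantees this code exists, runs, or honors the request:
        # if the handler drops the event, the user believes they cancelled
        # while the developer keeps billing them.
        if action == "cancel":
            subscriptions.pop(sub_id, None)
        return jsonify({"ok": True})

    if __name__ == "__main__":
        app.run()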
I'm not familiar with the design. Does the Cybertruck have a trunk, or are you referring to the truck bed? And if the latter, is the bed not intended to be open to the weather?
This is classic "correlation is not causation". The thing about correlation is that it could reflect a causal relationship, or there could be a set of untracked variables causing some or all of the effects, or it could be unrelated coincidence.
Now, maybe this is a difference between the study and the article; maybe the study makes stronger claims than the article does. But I didn't see anything in the article that claimed or demonstrated causation, only correlation.
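As a toy illustration of the "untracked variables" case (numbers made up, nothing to do with the study): a hidden common cause can produce a strong correlation between two variables that have no causal effect on each other at all.

    # A hidden confounder z drives both x and y, so x and y correlate
    # strongly even though neither causes the other.
    import numpy as np

    rng = np.random.default_rng(0)
    z = rng.normal(size=10_000)                    # untracked variable
    x = z + rng.normal(scale=0.5, size=10_000)
    y = z + rng.normal(scale=0.5, size=10_000)

    print(np.corrcoef(x, y)[0, 1])                 # ~0.8, despite no x -> y effect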