The best way to stop global warming is through intervention by central banks. We need monetary policy that creates incentives for green energy and carbon capture technology. We also need to focus on improving battery technology to store excess energy for later use.
We need to introduce innovation permits to restrict who is allowed to work on green tech so that we don't waste effort reinventing the wheel. We need to bring all the top green energy minds under one roof with one director at the top who has a PhD and a proven track record of success at raising capital.
The key to innovation is to reduce competition and increase collaboration. With such focus, it's critical that we don't put the wrong people in positions of power. We need a solid, proven track record in academia, politics and/or business - and the only way to verify this is by looking at the financial track record. The stock market is the only mechanism we can trust for ranking top people. Mediocre people who can't achieve success in such easy economic times should not occupy positions of power. If they can't make it during such easy, fiscally expansionary times, how will they perform during hard times?
We need better leaders with a proven track record and top credentials from top universities.
To summarize, you are asking for a hero to come and to save us all with the power of meritocracy. This hero of yours is a PhD executive (from a top university) with access to venture capital and will lead other eminent people to produce world changing green technologies. The stock market will be the bar for which this hero is selected.
Companies that exploit the environment already have PhD executive types (from great schools) with great track records and amazing access to capital already. That is part of the problem. These eminent people have already decided that their personal wealth now is worth destroying our future.
These geniuses have already found a solution: rampant hedonism. Squeeze out every moment of pleasure while you still can and when time is up, jet over to New Zealand and hunker down while the world is engulfed in flames.
First of all, to have any hope of reducing global warming to mild proportions, energy usage will have to shrink, and so will the economy. We are simply too far gone for any green energy solution and magical carbon sequestration to be able to help in time while also growing the global economy. Since we also want the global south to reach decent levels of development, the rich countries will have to accept even more shrinkage in their economies to accommodate rising consumption in the poorer ones - especially those hurt the most by the last few centuries of industrial progress.
This shrinkage will happen either way, due to the effects of global warming in the latter half of this century - today, we still have the option of doing it in a somewhat controlled manner. Tomorrow we will not.
Secondly, it is important to realize that it is impossible in principle, not just with current technology, to produce a battery with the energy density of gasoline or most fossil fuels. The best realistic batteries we can conceive of are about 6 MJ/kg, compared to 44-50 MJ/kg for hydrocarbons. This also means that transportation will become much more expensive in the coming decades, especially for goods - a battery-powered cargo ship will transport ~10 times less cargo per journey than today, and so will an electric truck (though the truck would likely be replaced by rail transport).
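The back-of-envelope arithmetic behind the energy-density comparison can be sketched like this (the figures are the approximate ones quoted above; the drivetrain efficiencies are my own assumed values, not from the comment - electric drivetrains are far more efficient than marine diesels, which narrows the raw gap somewhat):

```python
# Rough arithmetic behind the energy-density claim. Density values are
# the approximate figures from the comment; efficiency figures are
# assumed for illustration.

diesel_mj_per_kg = 45    # hydrocarbons, roughly 44-50 MJ/kg
battery_mj_per_kg = 6    # optimistic future battery chemistry

raw_ratio = diesel_mj_per_kg / battery_mj_per_kg
print(f"Raw energy-density ratio: ~{raw_ratio:.1f}x")  # ~7.5x

# Adjusting for assumed drivetrain efficiency (diesel ~50%, electric ~90%):
effective_ratio = (diesel_mj_per_kg * 0.50) / (battery_mj_per_kg * 0.90)
print(f"Efficiency-adjusted ratio: ~{effective_ratio:.1f}x")  # ~4.2x
```

So the ~10x figure is on the pessimistic end; even the raw density ratio is closer to 7-8x, and less once drivetrain efficiency is factored in.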
Innovation can harm efficiency by creating needless competition which is wasteful.
That's why innovation should be in the hands of corporations. The government should introduce laws requiring innovation permits to prevent wasteful competition.
These permits should be expensive; that way only big corporations could afford them, which will ensure maximum efficiency.
The vast majority of top projects are essentially scams and the real projects are not in the rankings. Even promising projects seem to turn into scams once they get into the top 200 or so. It's as if some wealthy investors buy up a lot of tokens from top projects and then threaten the founders to crash the price unless they stop development and start wasting time. I've worked in the blockchain industry for several years and I've seen founders who used to consistently make great decisions start making one terrible decision after another - it feels like they switched from being productive to being counter-productive from one day to the next. It's as if they're getting paid to NOT innovate.
I'm 100% sure that there is some manipulation going on but I can't figure out why (though I have some wild theories).
Name any major project and I can tell you why it's a scam.
Bitcoin: Uses the electricity of a country to process 2 transactions per second. Layer 2 solutions such as the Lightning Network have significant drawbacks which make them impractical and vulnerable to multiple attacks. They've been trying and talking it up for years - no results.
Ethereum: Doesn't scale. The entire ecosystem (including all ERC20 tokens) together cannot process more than 30 transactions per second. New ERC20 tokens have to pay the same HUGE (e.g. $20 per transaction) fees as the mainchain; all tokens slow each other down (compete for resources from each other and drive up each other's transaction fees). They said that sharding was essentially ready years ago but now they've basically canceled it (or 'put it on the backburner' as they like to call it) in favor of extremely complex and vulnerable layer-2 ZK-Rollups solutions which are completely unproven (we don't know what will happen when many projects start adopting rollups; expensive on-chain interactions still need to happen).
Polkadot: They claim everywhere to have 'Parachains'. The reality is that this feature doesn't exist yet. The way it's designed is extremely complex and the scalability benefits are limited because there can only be a limited number of parachains.
Also, one thing which almost all the projects have in common is that they're mostly targeted at developers... Yet as a developer, there is almost always a MASSIVE amount of friction involved in setting up the blockchains and integrating them with other systems. For example, Ethereum requires a minimum of 300GB of disk space to run a node (you need to run a node to do any serious integration testing). Also, the Ethereum node doesn't even provide a basic search feature; you need to use CENTRALIZED third-party services in order to search the blockchain data (that's because the node writes to a file instead of a proper database)... OMG. I could go on and on and on. There is just so much money behind these projects that the entire community will constantly twist the facts and present a severely distorted view of reality.
There is no limit to the amount of deception and self-deception when there is money involved.
These are all great criticisms, but I struggle to agree with you that they make these projects "scams".
By way of analogy, I remember Ruby on Rails really struggled with scalability for a long time, and it was a big problem that lots of people, both proponents and opponents, talked about a lot. Nonetheless it turned out to be quite useful and definitely not a scam. I saw similar dynamics in both AngularJS and React. I'm trying to think of a good example on the other side, something that was hyped but criticized and didn't really succeed due to its criticisms being right... maybe something like Meteor; it seemed promising but flawed and never really overcame its flaws. But none of these were "scams", just different flawed projects that succeeded or failed despite or because of their flaws.
By my lights, the top two you mentioned (I don't know enough about the third to say) fit very much into this same mould, I think they are flawed projects that will succeed or fail despite or because of widely recognized flaws which are or aren't eventually overcome. But not scams.
I think scams have to have a component of intentionality, that all effort at appearing legitimate and promising is conscientiously only for show. Contra that, I think lots of people are making a good faith effort to make bitcoin and ethereum useful. They may very well fail, but I don't think most people involved are conscientiously doing the work just for show.
I think making promises and taking money from people based on promises you very likely can't fulfill is enough to label something as a scam, and that seems to be the case with all of the points mentioned above.
Is this also true of all startups that take funding and then fail? That would be one reasonable definition of "scam" I think, but personally I think it is more useful to have different terminology for speculative high risk ventures that make a good faith effort but fail vs. malicious schemes designed only to take money and run. I think there are lots of both things in this cryptocurrency space, but I think it's reductive to throw your hands up and say they're all the same.
Debating myself a bit: the reason startups tend to attract fewer scams is that only accredited investors can invest in them. There is definitely something to be said for that!
Generally you're only going to get decent money from bigger names (or at least one leading the round) - and a bad reputation will make it quite unlikely you'll get to play again. This helps avoid many straight-up scams like Dentacoin (lol).
Then, your future rounds of investment are conditional on demonstrated success. Your A can be a bridge round based on traction or a materialized idea. However, your B is generally based on hard numbers.
ICOs tend to get series F/G money up front on a hope and a prayer.
Maybe not completely, but they all profit from having an element of deception. They are drawing attention away from better projects by hoarding all the top spots on all the exchanges and ranking websites. They are destroying the industry and hurting people IMO.
It is not clear to me where the "deception" is, even if the rest of what you said is right. The things mentioned in this thread are common knowledge and widely discussed.
> Bitcoin: Uses the electricity of a country to process 2 transactions per second. Layer 2 solutions such as the Lightning Network have significant drawbacks which make them impractical and vulnerable to multiple attacks. They've been trying and talking it up for years - no results.
Can you be more specific about the drawbacks with layer 2 solutions such as lightning network? I use lightning from both the business and the consumer side, and from my perspective, it works just fine. I am able to make payments with negligible fees that settle instantly, and people are able to pay me (business) without any real problems. At this point, the vast majority of Bitcoin transactions I do settle on a layer 2. Frankly, it just kinda "works".
> Ethereum: Doesn't scale. The entire ecosystem (including all ERC20 tokens) together cannot process more than 30 transactions per second. New ERC20 tokens have to pay the same HUGE (e.g. $20 per transaction) fees as the mainchain; all tokens slow each other down (compete for resources from each other and drive up each other's transaction fees). They said that sharding was essentially ready years ago but now they've basically canceled it (or 'put it on the backburner' as they like to call it) in favor of extremely complex and vulnerable layer-2 ZK-Rollups solutions which are completely unproven (we don't know what will happen when many projects start adopting rollups; expensive on-chain interactions still need to happen).
What makes ZK-rollups "extremely complex and vulnerable"? And perhaps touch on optimistic rollups as well (since these are about to launch and will have dramatic increases in throughput as well)?
It seems to me that you are making grandiose claims of problems without any real evidence.
When Stellar started, they were all about 'Quorums'; trying to imply that this was the secret sauce which would allow it to scale unlike any other blockchain. I initially thought that quorums were like separate shards but after asking around years ago, I found out that it was not the case; all transactions pass through all nodes; exactly the same as a plain old blockchain. These days they barely even mention the concept of a 'quorum' because it was never anything more than a scammy marketing tool.
I don't know too much about Cardano so I won't criticize too much, but when I skim-read their whitepaper about a year ago, it sounded over-complicated. This is a red flag for me. Also, they have yet to implement smart contracts, so there is still a long way to go. I don't like that they keep bragging about their all-PhD team. In my experience, PhDs aren't good at delivering good developer experiences or limiting the amount of complexity.
I wouldn't say that XRP/Ripple is a scam; but only because they don't make it a secret that they are essentially a centralized cryptocurrency with multiple nodes for redundancy. But some could argue that they are a scam based on the fact that they don't solve any of the problems that a cryptocurrency is meant to solve (this critique pretty much applies to all top cryptocurrencies BTW; they don't solve any significant economic problem aside from upholding the status quo; the opposite of what they claim to do).
>> So it's a scam because it doesn't exist yet and it's planned on the roadmap?
It's scammy because they sell it as if it already exists, but it doesn't.
Ripple likening themselves to other cryptocurrencies when they are centralized is scammy in my book. There is a reason that video game currencies or Magic Cards aren't listed on exchanges.
Our society is not capitalist. New currency is constantly being printed out of nothing and pumped into the system. That's not a free market. Businesses just follow the money. It can be as inefficient as society allows it to become. As long as most people can ignore the reality.
If the banks were only loaning people money to collect sea shells, then the entire global economy would end up revolving around sea shell collection.
Then some rational people would eventually write books claiming that seashell collection is a bullshit job, we don't need so many seashells... Then people like you would post comments saying "That's not true, obviously we need seashells because capitalism would never allow for such inefficiency."
> Our society is not capitalist. New currency is constantly being printed out of nothing and pumped into the system. That's not a free market.
I don't know whether it's more mysterious or annoying that people keep repeating this... simplistic fallacy. WTF does one have to do with the other?!? "Capitalism" or "a free market" is how the game is played or what it is about; "currency" or the monetary supply is just the markers used to count points in the game. They're not the same thing.
In the game of Monopoly new currency is injected constantly, when players collect $200 (IIRC) for completing each lap around the board. Does that make the goal of the game not about amassing all the money[1] and bankrupting the other players? No, of course not. The goal of the game and the markers used to count progress towards that goal are different things.
Conflating them is committing a stupid fallacy, even more so in real life than in the game.
____
[1]: All the money in circulation, regardless of how much that happens to be.
There is a massive difference. In Monopoly, everyone gets the same amount when they cross 'Go'. If that rule were taken from the real world, people would be paid in proportion to their net worth (the value of all their assets). Real life is even less fair than Monopoly in that respect.
If two people own shares of a company which grows at 10% per year; one person owns $1 million worth of shares and another owns $1000 worth of shares. The first person gets $100K worth of additional wealth in the first year, the other person gets only $100... Then the following year, the compounding effect increases the gap between them even more.
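The compounding effect described above can be made concrete with the comment's own hypothetical figures (10% annual growth, $1M vs $1,000 starting positions):

```python
# Compounding widens the absolute wealth gap year over year.
# Figures are the hypothetical ones from the comment above.

def wealth_after(principal: float, rate: float, years: int) -> float:
    """Value of a holding after compounding at `rate` for `years`."""
    return principal * (1 + rate) ** years

rich, poor, rate = 1_000_000, 1_000, 0.10

for years in (1, 10, 30):
    gap = wealth_after(rich, rate, years) - wealth_after(poor, rate, years)
    print(f"After {years:2d} years the gap is ${gap:,.0f}")
```

After one year the gap has already grown by roughly $100K; after 30 years of identical percentage returns, the absolute gap is more than 15 times the original one.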
Even if that growth was completely natural (no central bank intervention), it would still be unfair... But what makes the system so incredibly unjust is that all this growth is ARTIFICIAL. The central banks print money and push it into the economy; the institutions on the front line then basically use that new money to pay one another for services; thus wiping out each other's debts using the freshly printed money. It's a giant, multi-layer pyramid scheme.
Nothing like a bit of reductio ad absurdum to start the day.
I don’t deny that there is some wasted effort; what I dispute is that the amount of truly wasted effort is significant, and that the measure for whether or not a job is bullshit should be whether the person doing it thinks it shouldn’t exist.
We are capitalist in as much as you and I are allowed to build a machine and pay people to run it. I agree there are no free markets, all markets are created by governments just as all currency is.
This article is garbage. The argument is basically like saying "famous scientist X was wrong about Y, let's stop doing science. Clearly there is no point to it."
I cannot believe what I am reading here.
My open source community knows exactly what good code looks like and we've delivered great products in very short timeframes repeatedly and often beating our own expectations.
These kinds of articles make me feel like I must have discovered something revolutionary... But in reality I'm just following some very simple principles which were invented by other people several decades ago.
Too many coders these days have been misled into all sorts of goofy trends. Most coders don't know how to code. The vast majority of the people who claim to be experts and who write books about it don't know what they're talking about. That's the real problem. The industry has been hijacked by people who simply aren't wise or clever enough to be sharing any kind of complex knowledge. There absolutely is such a thing as good code.
I'm tired of hearing developers who have never read a single word of Alan Kay (the father of OOP) tell everyone else how bad OOP is and why FP is the answer. It's like watching someone drive a nail straight into their own hand and then complain to everyone that hammer and nails are not the right tool for attaching two pieces of wood together... That instead, the answer is clearly to tie them together with a small piece of string because nobody can get hurt that way.
Just read the manual written by the inventor of the tool.
Alan Kay said "The Big Idea is Messaging"... Yet almost none of the OOP code I read designs its components so that they "communicate" with one another... Instead, all the components use methods to micromanage each other's internal state... passing ridiculously complex instances around to each other (clearly a whole object instance is not a message).
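The distinction can be sketched in a few lines (the class and method names here are invented for illustration, not from any real codebase):

```python
# Micromanaging style: the caller owns the logic and pokes at the
# object's internal state directly.
class ThermostatBad:
    def __init__(self):
        self.target = 20.0
        self.heater_on = False

def controller_bad(t: ThermostatBad, reading: float) -> None:
    t.heater_on = reading < t.target  # caller mutates internals

# Messaging style, in the spirit of Kay's "messaging": the caller sends
# a simple message (not a whole object graph); the receiver decides how
# to react and keeps its state private.
class ThermostatGood:
    def __init__(self, target: float = 20.0):
        self._target = target
        self._heater_on = False

    def receive(self, message: str, value: float) -> None:
        if message == "temperature_reading":
            self._heater_on = value < self._target

    @property
    def heating(self) -> bool:
        return self._heater_on

t = ThermostatGood()
t.receive("temperature_reading", 15.0)
print(t.heating)  # True: the object decided, not the caller
```

In the second version the caller only needs to know the message protocol, not the object's internal layout, which is the property the comment argues most "OOP" code fails to preserve.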
> The argument is basically like saying "famous scientist X was wrong about Y, let's stop doing science. Clearly there is no point to it."
In my opinion the argument is more "famous paper X by scientist Y was wrong, let's stop citing it". Except that Clean Code isn't science and doesn't pretend to be.
If the article only attacked that specific book "Clean Code", then I would not be as critical. But the first line in the article suggests that it is an attack against the entire idea of writing good quality code:
'It may not be possible for us to ever reach empirical definitions of "good code" or "clean code"'
It might seem far-fetched that someone would question the benefits of writing high-quality code (readable, composable, maintainable, succinct, efficient...) but I've been in this industry long enough (and worked for enough different kinds of companies) to realize that there is an actual agenda to push the industry in that direction.
Some people in the corporate sphere really believe that the best way to implement software is to brute force it by throwing thousands of engineers at a giant ball of spaghetti code then writing an even more gargantuan spaghetti ball of tests to ensure that the monstrosity actually works.
> is an attack against the entire idea of writing good quality code:
> 'It may not be possible for us to ever reach empirical definitions of "good code" or "clean code"'
I read it as an attack against the idea that there are hard-and-fast, objective, empirically verifiable rules for good quality code. The Clean Code book itself is a perfect example of how subjective such rules are. I feel really sorry for you if the only methods for software development that you know are "brute force it by throwing thousands of engineers at a giant ball of spaghetti code" and sticking to a book that has some fanatical supporters. "Readable", "maintainable", "succinct" or "efficient" don't really describe Martin's examples, and many functional programming enthusiasts would question "composable" too. Yes, I wasted several hours of my life reading that book and I'm never getting them back.
I never said that this is what I believe. I said it's what a lot of people in the corporate sphere believe. It's the opposite of what I believe.
OOP solves complex problems in a simple way.
Functional Programming solves simple problems in a complex way.
Some concepts from FP are useful when applied within OOP, but pure FP is simply not practical. It doesn't scale in terms of code size, it's inefficient, it's inflexible, it takes longer to develop and maintain, it's less readable because it encourages devs to write long chains of logic spread out across many files. FP's lack of emphasis on important concepts such as blackboxing, high cohesion and loose coupling encourages developers to produce poor abstractions whose names sound highly technical but whose responsibilities are vague and impossible to explain without a long list of contrived statements which have little in common with one another.
Abstractions in FP tend to be all over the place. It seems to encourage vague, unstructured thinking. Decoupling state from logic makes it impossible to produce abstractions which are high cohesion and loosely coupled. It forces every component to mind every other component's business.
This is madness. If you don't care about structure, why not just write the entire system as a single file and define thousands of functions which call each other all over the place? You would get the same spaghetti, but you would save yourself the effort of having to jump around all these files which don't add any meaningful structure anyway.
It's always the case that when I present these arguments above to FP devs, they respond with personal insults instead of counter-arguments. This suggests that they know my arguments are accurate but they are too invested in FP and are in denial - It's emotional, so they respond emotionally.
You shouldn't think of it like "I've been fooled and wasted all this time on FP". You should think about it like "I gave FP a thorough analysis over several years and it was a worthwhile experiment which didn't work out but I learned a lot from it".
I also spent quite a lot of time working with and reading FP code over the last 15 years. That's why I can criticize it with confidence today. It was not a waste.
Comments like "Functional Programming solves simple problems in a complex way", "Abstractions in FP tend to be all over the place." or "If you don't care about structure, why not just write the entire system as a single file and define thousands of functions which call each other all over the place?" don't really invite polite discussion. To me they look like you are either reacting on pure emotion, or don't understand what you are writing about (note that this is the opposite of "accurate"). It also seems that you think that I'm whatever you believe "functional programmers" are.
If there's something we maybe agree on, it's that enforcing pure functionality (I guess that's what you're writing about; if you think functions as first-class objects are bad, you're incorrigibly wrong) is about as bad as the maximal OOP that Martin is preaching. There's a lot of space between these two extremes if you're willing to look. And a lot of money to be made by sticking to an ideology and preaching it.
It would be an interesting experiment if you could show me the GitHub repo of the best written open source FP project you've ever encountered and I could point out its flaws and rank them on a scale based on how critical they are in terms of maintenance and performance.
> But the first line in the article suggests that it is an attack against the entire idea of writing good quality code:
> 'It may not be possible for us to ever reach empirical definitions of "good code" or "clean code"
If you re-read that line you quoted you may find that it talks of the (im)possibility of defining what "good code" or "clean code" is, not of actually writing it.
Functional programming is OK up to a certain degree of complexity... Good if you have a lot of junior developers on your team because it prevents certain kinds of mistakes which juniors tend to make. But many FP projects end up becoming a giant pile of spaghetti code eventually.
I think this is because FP doesn't put emphasis on separation of concerns like OOP does. FP advocates separating state from logic as being more important than separating concerns across components.
With OOP, if some piece of state concerns the same business domain as some piece of logic, then they should be co-located. On the other hand, FP will happily violate this principle in favour of keeping state and logic separate.
Having built highly complex, modular systems with OOP, I don't see how it would have been possible to implement these systems in such a modular way using FP... Co-locating related state and logic is absolutely essential to achieve modularity.
I have yet to see any complex FP system which was not spaghetti code.
Obvious troll is obvious but... I assure you there are many high complexity systems in production today using FP that are not spaghetti code. Just like there are OOP codebases which are spaghetti, abstractions are used or abused. I’ve seen both very bad and very good Ruby projects. OOP wasn’t the direct reason why they were one or the other.
Yes, but the difference is that OOP can scale indefinitely but FP cannot.
If you don't co-locate related state and logic, ensuring that each piece of state reaches the correct components becomes a logistical nightmare at scale (either needs to traverse many intermediate components or needs to be exposed globally to all components).
If you don't allow components to mutate state locally, you need a strategy to distribute all your data directly from external stores to the correct components and also a strategy to send state updates back to the relevant stores. When you have multiple components which share some of the same data, this is impossible to do in an efficient and reliable way because of the latency between the external store and the components (components risk overwriting each other's data by sending conflicting updates, for example).
It's a logistical nightmare, because, without co-locating state with logic in some components, you cannot control how state updates are combined and then applied onto the external store. Components have no awareness of each other so they cannot coordinate. Every read and write has to go through the external store which is both inefficient and unreliable when you factor in latency. (The idea of caching violates FP principles).
The advantage of co-located state is that it can be updated synchronously without any latency so there is no possibility of conflicts.
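A minimal sketch of the co-location argument (the `Account` class and the external-store contrast are hypothetical, invented purely to illustrate the point):

```python
# When state lives inside the component that mutates it, the
# read-modify-write happens in one synchronous step, so callers can
# never base a write on a stale read of that state.

class Account:
    def __init__(self, balance: int = 0):
        self._balance = balance  # state co-located with its logic

    def deposit(self, amount: int) -> int:
        # Atomic from the caller's perspective: no latency window
        # between reading the balance and writing it back.
        self._balance += amount
        return self._balance

    @property
    def balance(self) -> int:
        return self._balance

# Contrast, with an external store each caller would do something like:
#   current = store.read("balance")       # network hop; stale by the
#   store.write("balance", current + n)   # time this write lands
# and two concurrent callers can silently overwrite each other's deposit.

acct = Account()
acct.deposit(50)
acct.deposit(25)
print(acct.balance)  # 75
```

This is the "lost update" problem from database literature; co-locating state with logic is one way to avoid it, though external stores address it differently (transactions, compare-and-swap), which the comment's framing doesn't cover.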
>> If you don't allow components to mutate state locally,
State mutation is, in my personal experience, a shit show. With Elixir, I don't have to worry about some random process mutating my data, because it can't; it's literally immutable. I have never, not a single time, wished I could mutate a data structure in Elixir, because I can think of no case where it would make my life easier. Even quasi-objects, in the C++/Java OOP sense, like GenServers with internal state, may appear to be mutating data from the outside, but from the inside they still rely on copying the data to update the state. It's so much easier for me to reason about.
Maybe different things simply appeal to different people. I could make arguments for OOP being harder to scale, but maybe that's just true for how my brain works.
State mutations are safe and easy to handle if the state is fully contained inside a blackbox (a class) and you only return copies of the state but never the actual references.
A blackbox should never expose object references (its internal state) to its parent components. It should also avoid passing object references to child components unless it's sure that the child component will never mutate this state in an unexpected way.
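One way to follow that rule is to have accessors return defensive copies rather than references (a hypothetical sketch; the `Inventory` class is invented for illustration):

```python
import copy

# The class never hands out a reference to its internal state, only
# copies, so callers cannot mutate it behind its back.

class Inventory:
    def __init__(self):
        self._items: list[str] = []  # internal state, never exposed

    def add(self, item: str) -> None:
        self._items.append(item)

    def items(self) -> list[str]:
        # Defensive copy: mutating the returned list leaves the
        # blackbox's own state untouched.
        return copy.deepcopy(self._items)

inv = Inventory()
inv.add("widget")
snapshot = inv.items()
snapshot.append("tampered")  # mutating the copy...
print(inv.items())           # ...does not affect the blackbox: ['widget']
```

The cost is an extra copy per access, which is why some codebases prefer immutable data structures (the Elixir approach above) that make the copy semantics the default rather than a discipline.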
That doesn't align with my experience. In my experience, they pretend to care about the employee, but when the employer is in the wrong, even in an obvious way, they will side with the employer in the blink of an eye every time instead of trying to reason with the employer.
From worst colleague: Succeeding in our modern economy has nothing to do with ability, intelligence or work ethic. It's about luck and intimidation.
From best colleague: 10x, 100x, 1000x... developers are real. I also learned that making bold, controversial claims will yield respect dividends in the future when they turn out to be true. For one of my past startups, I was working closely with the CTO who was at least a 10x or 100x, and some of the stuff he was saying seemed to defy a lot of the industry practices at the time. I didn't take what he was saying very seriously at first, but over the years, I ended up discovering that essentially everything he said was true and led to huge productivity increases. It had a big impact on me to understand that software development is a craft which can be mastered on a whole different plane beyond what the vast majority of developers can imagine. I consider a lot of his teachings to be 'trade secrets' and I try to pass them on to other developers I care about (especially open source collaborators).
There are so many. Sometimes they're big picture stuff, sometimes they're more detailed.
One of the lessons I learned in my junior years is that I used to write CSS with a lot of nesting and relatively short class names (I used SCSS which made nesting easier) - The CTO kept telling me that I should avoid using more than 2 levels of nesting and to use long descriptive class names. I felt that was against the whole point of "cascading" in CSS. I soon discovered (during the first redesign) that when you have a lot of nesting in the CSS, it becomes extremely difficult to refactor the layout of the application; moving containers around breaks the layout every time. Whereas if you have a relatively flat CSS, you can move around components easily without breaking anything. It seems so obvious in retrospect.
I also learned the importance of having one source of truth and of making sure that each kind of data flows through the system in a single, clear direction (or else you can get glitchy behaviour with certain kinds of realtime data). A common one is to make sure data flows from the source of truth. For example, when you click a component, it should update the URL hashbang, then other components should react to the change in the URL... Don't make the components react to the click directly and only update the URL hashbang as a side effect (because the click event could conflict with the URL change event in some edge cases; e.g. causing the UI to be rendered twice, which is a common issue). The URL has to be the source of truth.
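The flow described above can be sketched abstractly (the `Router` class and handler names here are hypothetical, standing in for real browser history/hashchange APIs):

```python
# Unidirectional flow: clicks only update the URL (the single source of
# truth); components only ever render in reaction to URL changes.

class Router:
    def __init__(self):
        self.hash = ""
        self._subscribers = []

    def subscribe(self, callback) -> None:
        self._subscribers.append(callback)

    def set_hash(self, new_hash: str) -> None:
        if new_hash == self.hash:
            return  # no change, no notification: avoids double renders
        self.hash = new_hash
        for cb in self._subscribers:
            cb(new_hash)

renders = []
router = Router()
router.subscribe(lambda h: renders.append(f"render page for {h}"))

# A click handler does NOT render directly; it only updates the URL.
def on_click_nav_link() -> None:
    router.set_hash("#/settings")

on_click_nav_link()
on_click_nav_link()  # duplicate click: URL unchanged, so no second render
print(renders)       # ['render page for #/settings']
```

Because every render is triggered by exactly one event source (the URL change), the click-vs-URL-change race described in the comment cannot produce a double render.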
Another critical lesson was about separation of concerns and the importance of being able to justify all technical decisions using simple non-technical language.
But there are so many lessons. I was fullstack so I learned a lot of backend tricks as well. A different one seems to come up every day when I work with other developers on a complex project. I try to explain the reason as much as possible because sometimes they sound counterintuitive (or I should say counter-narrative) and they're not usually silver bullets so it's important to convey the nuance as well.
The main point is that a software developer's job is to fix stuff so you end up introducing complexity, but you will have a much easier life if you realise when you're about to introduce more complexity, and try to minimise it by thinking about your implementation choices. KISS, I guess.
Nice, this is a broader way to explain the effectiveness of my CSS approach and many other approaches. IMO, Modern developers should read fewer books about tools and more books about software philosophy. It's far more important.
It's weird to think that philosophy can actually yield productivity gains... Many people think of philosophy as being the antithesis of productivity.
A single developer who can complete projects in the same amount of time as 10 or 100 or 1000 regular developers to at least the same quality standard (but usually to a higher quality standard because code quality is key to getting that productivity gain in the first place and code quality usually translates to project quality).
That said, there are projects for which a developer could be infinity-x compared to regular developers. For example, a highly complex project which regular developers do not have the capacity/talent to complete, ever... Just imagine a very complex project like programming a quantum computer, or certain kinds of blockchain or distributed systems projects; some developers will never be capable of delivering such projects. It's beyond their innate ability and capacity for learning.
Project complexity maximizes the utility value of highly productive developers. It would be difficult to identify a 100x developer on a simple project (they might only appear as 2x) but they would stand out as a definite 100x on a complex project.
The problem with being a 10x or a 100x dev is that you never get paid commensurate to your value. You’re almost always better off (psychologically - unless you have mouths to feed :P) starting a business if you have business/marketing chops at all.
This is horrible... And just like that, now the government can introduce laws based on people's identity/group.
Well at least it doesn't affect me personally... I didn't get the vaccine but I identify as a vaccinated person. Deep down, I feel that I'm vaccinated.