> what is the point of creating a language where everything is an object?
I think that's the ultimate culprit in everyone hating inheritance. If it weren't for Java, I think we'd all have a healthier view of OO in general.
I learned OO with C++ (pre-C++11), and now I work at a Java shop, but I'm lucky that I get to write R&D code in whatever I need to, and I spend most of my time in Python.
In C++ and Python, you get to pick the best tool for the job. If I just need a simple function, I use it. If I need run-time polymorphism, I can use it. If I need duck typing, I can do that (in Python).
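Something like this, just as a rough sketch (the names are toy names I made up for illustration):

```python
import math

# A plain function when that's all the problem needs.
def area_of_square(side: float) -> float:
    return side * side

# Duck typing: total_area() accepts anything with an .area() method,
# no common base class or interface declaration required.
def total_area(shapes) -> float:
    return sum(s.area() for s in shapes)

class Circle:
    def __init__(self, r: float):
        self.r = r
    def area(self) -> float:
        return math.pi * self.r ** 2

class Square:
    def __init__(self, side: float):
        self.side = side
    def area(self) -> float:
        return self.side * self.side

print(area_of_square(3.0))                     # 9.0
print(total_area([Circle(1.0), Square(2.0)]))  # both "quack" like a shape
```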
Without the need for strict rules ("always" or "never" use inheritance), I can pick what makes the best code for the job, whether that means the fastest, most readable, most maintainable, or most extensible; it depends on the context.
Related to TFA, I rarely use inheritance because it usually doesn't make sense (unless you shoehorn it in, like everyone in the thread is complaining about). But in the cases where it really does work (there genuinely is an "is a" relation), it does make life easier and it is the right thing.
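A hypothetical example of the kind of case I mean, where the subclass really is a more specific version of the parent:

```python
# ConfigError "is a" ValueError, so code that already catches ValueError
# handles it with no changes: inheritance is pulling its weight here.
class ConfigError(ValueError):
    """Raised when a configuration value is invalid."""

def parse_port(raw: str) -> int:
    port = int(raw)              # may itself raise ValueError
    if not 0 < port < 65536:
        raise ConfigError(f"port out of range: {port}")
    return port

try:
    parse_port("70000")
except ValueError as exc:        # also catches ConfigError
    print(f"bad config: {exc}")
```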
Context matters, and human judgement and experience are often better than rules.
> If it weren't for Java, I think we'd all have a healthier view of OO in general.
>
> I learned OO with C++ (pre-C++11), and now I work at a Java shop, but I'm lucky that I get to write R&D code in whatever I need to, and I spend most of my time in Python.
>
> In C++ and Python, you get to pick the best tool for the job. If I just need a simple function, I use it. If I need run-time polymorphism, I can use it. If I need duck typing, I can do that (in Python).
Those first two you can do in (modern) Java. The third is a mess to be avoided at all costs. Interfaces and lambdas will cover most reasonable use cases for polymorphism.
Yes, you're right. I've done that in high-performance code where I couldn't afford the overhead of a virtual function call. I forgot about that.