Wow. You really are not kidding. What a mess. I actually feel stressed looking at their home page. It's overwhelming. THEN they've decided to top the chaos off with animated Figma-style cursors that have no apparent relationship to the hero content.
It doesn't get any better as you scroll either. Everything is animated and changes while you're trying to read it.
I tried to give them feedback through their "Contact Us" form and it's broken.
-----
Sketch. If you read this thread, please take this feedback seriously. It's shocking how unusable your marketing website is.
I thought it couldn't get worse until I tried to select text on the front cover and realized the whole thing is draggable. The designs on the side are draggable as well, but they're also images, so it conflicts with normal image dragging in Firefox.
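For anyone wondering how this kind of conflict arises: a drag handler that calls preventDefault() on every mousedown suppresses text selection, and if it also swallows presses on images it fights the browser's native image drag. Here's a minimal TypeScript sketch of one way to avoid both, assuming a hypothetical '.hero' container with custom drag-to-pan behavior (not Sketch's actual markup or code):

    // Hypothetical '.hero' container with custom drag-to-pan behavior;
    // not Sketch's actual markup or code.
    const hero = document.querySelector<HTMLElement>('.hero');

    hero?.addEventListener('mousedown', (event) => {
      // Only start a pan when the press lands on the empty background,
      // not on the hero's text or images. This leaves native text
      // selection and Firefox's built-in image dragging alone.
      if (event.target !== event.currentTarget) return;
      // Calling preventDefault() indiscriminately on every mousedown
      // is exactly what breaks text selection.
      event.preventDefault();
      // ...custom pan logic would start here...
    });

    // Optionally, opt purely decorative images out of native dragging:
    for (const img of document.querySelectorAll<HTMLImageElement>('.hero img')) {
      img.draggable = false;
    }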
You are missing the final key phase: everything gets crappier and we just accept it. There is no denying that LLMs are the future. But, much like the SEO-optimized Internet and offshored goods, everything just sucks a little more.
Or hyper-processed foods, and city planning that encourages sedentarism. Yes, congratulations to Kraft and Ford on their victory; now 38% of the adult population of the US has prediabetes.
In western and more arid states they're a huge waste of water. Really depends on the climate. In other areas they're just a lost opportunity for biodiversity. Mixing in some clover or other small, soft perennial flowers can help a lot with making the lawn both self-fertilizing and more pollinator-friendly.
(We kept part of our grass for exactly that reason but it helps that we don't have to water the lawn here. With added clover I'm quite happy with it.)
Mixing clover and other small flowering plants with grass is definitely the way to go. I can sit on the swing in my backyard and lose count of all the bumblebees and small butterflies bouncing across the lawn.
What I don't understand is the need for huge front lawns. It's all the work with none of the benefits. Just a waste of space really. People don't use them the way they use their back yards at all.
Different neighborhoods work differently. I played baseball in my front yard. First base (and the only one besides home) was the mailbox across the street.
But it's true that a lot of people don't use their front yards for much, except perhaps as a noise buffer from the road. If you go a non-lawn route, you'll need to be careful to maintain it in a way that doesn't encourage intervention by neighbors, municipalities, etc. On the other hand, I'm rewilding a bit of my back yard, and nobody says anything.
> What I don't understand is the need for huge front lawns. It's all the work with none of the benefits. Just a waste of space really. People don't use them the way they use their back yards at all.
Have you considered that some people just like looking at it and that's enough? It's their private property after all.
The lawns aren’t the reason for the HOAs. The HOAs are intended to try to protect the investment value of the homes by enforcing a level of maintenance in all of the properties. Not saying they are a net good but it doesn’t start with the lawns.
The noise from all the grass mowing is really annoying (second only to leaf blowing). Fortunately, people are slowly switching to much quieter electric mowers.
The HOA in the neighborhood I grew up in mandated grass from a narrow list of types, and live oak trees. That's fine for about 15 years, give or take, but then the trees start to kill the grasses below them. This wasn't a groundbreaking horticultural discovery, but the rules were written that way nonetheless.
It took years for the neighborhood to convince the HOA to allow other kinds of ground cover (jasmine, frog fruit, etc.). In the meantime, people were paying yearly to resod with new grass, or paying fines that cost more than the new sod itself.
It's not insane but it sure sounds like overkill to me. That's a good bit for just mowing, and once a week is significantly more than I would expect from someone who views lawn mowing as an expensive necessity.
As someone who has been doing this for a few years in Virginia, which is plenty wet, it's not really bad. I minimize mowing until May (a little earlier this year because it's been so warm so early), and the first go of it is tough, but I put the mower on its highest setting; then I keep it relatively short by June or July. Meaning, it's really no big deal.
The real issue, as far as I'm concerned, is what you're letting grow. If you just let it all grow out, it's not likely that you just have commercial grass growing, but a bunch of weeds. Now I like weeds, myself, but some are less desirable than others, particularly some invasive ones, and neighbors can complain (mine don't). My approach is to selectively weed the lawn (yes, by hand, and it's a half-acre) to try to rid it of the weeds I don't want, and selectively encourage those I do (e.g. in my case, the native fleabane, which flowers in May). I do see a difference in the number of insects visiting fleabane compared to non-native weeds. Now, I'm in a rural enclave of an urban area in an environmentally sensitive part of Virginia, so my approach is different than it might be if I were in a California suburb, for example.
I get loads of dandelions and dock, which I find ugly (grandmotherly conditioning) and which are not terribly good for bees.
In the UK at least, they say wildflower meadows flourish on poor soil, where dandelion, dock and grass struggle. Sometimes you read that you should scrape off the fertile topsoil before sprinkling wildflower seeds, but this seems kinda perverse to me.
Instead, I weed the dock and dandelion by hand. Both have pretty evil taproots. This is enough of a task with my small garden, I can't imagine doing it for a half-acre!
I have quite a lot of tree cover, so I planted native bulbs beneath the lawn, which have done reasonably well (they mostly come up before the trees are in leaf). I basically don't mow these patches from February until June, as the bulbs are best left until they die back naturally to store up energy for the next year. I mow a winding path through the long grass which looks rather nice (mowing around the longer areas is a good general tip for making things look neat rather than neglected). Other than that, I seem to have clover, creeping buttercup, cow parsley and common vetch. I rather like the vetch so spread it around a bit. This year I've put in a few ox-eye daisy plugs, which I'm told I may live to regret, but we'll see.
Nice. I think we have to choose our battles, and even in our own property I think we may end up having to take a Tom Bombadil in the Old Forest approach.
The "explanation" confuses correlation with causation. It is true that during the pandemic, with two stimulus checks from Trump and one from Biden, average savings increased, and many companies saw that as an excuse to raise prices to try to drain that savings. But that savings was not evenly distributed, and the timeline doesn't quite work. It's a word-association fallacy.
So, what, you think that children should have to figure out computers for themselves? That only the geeky ones (who are likely to poke at stuff and learn that way) should actually have a clue how to use computers?
Or is it that you think parents should be teaching this—which guarantees it's only going to be taught to the children of the reasonably well-off?
Welcome to the 21st century, where it's widely beneficial for everyone to know how to use computers at least at a basic level, and skills like "how to use simple Excel formulas" are often enough on their own to get people decent-paying office jobs. Because the people already working there were never taught how to use computers in any systematic way beyond, maybe, "keyboarding classes".
Where many governments require forms of various types to be submitted online, and publish many kinds of information only on their websites.
Where it's nearly a guarantee that people are going to be using social media, no matter how much you try to restrict them, so it's a damn good idea to teach them how to recognize misinformation and other kinds of manipulation (not strictly a computer skill, but definitely related).
I certainly wouldn't argue with how anyone else raises their children, but I think it has been hugely beneficial to restrict access to technology early in my kids' lives. It can be introduced as they get older without worrying about them falling behind.
I'm not talking about kids. I'm saying those kids will grow up, and once they grow up, they will almost certainly be on social media.
If they are not taught about critical thinking, media literacy, and how to recognize manipulations and misinformation while they are still in school, they will be easy prey for all the various kinds of sharks out there.
Most people who disagree (at least the ones I've talked with) assume that the computer will REPLACE traditional teaching: books, drawing, pencil, paper.
Nevertheless, promoters don't generally want to get rid of traditional teaching tools; they just want to add digital tools as part of the learning experience.
And contrary to popular belief, kids don't want to spend all their time on computers.
My daughter is 8 and already has a Chromebook at school. I freaked out when she started to search for stuff on Google at home on the iPad. I need to look into parental controls.
I think most people (not devs/designers) view a laptop as something that costs around $750. That is why Windows still has a huge market share. Dell, Lenovo, Samsung, and Microsoft all have laptops with solid build quality, OLED screens, touchscreens, etc. in the $1,500-2,200 price range, but no one buys those. If you are in that price range, you are probably in one of those niche industries (design, development, graphics) and you are probably buying an Apple.
That view is so myopic - loads of people buy high-end non-Apple laptops, and I would anecdotally guess many, many more than buy Apple overall. It just seems you're lacking awareness of it - probably a particular social bubble you're in, combined with myopia to the rest of the world.
I disagree. Dell and Samsung laptops in that price range are overpriced rubbish if you compare them to current Apple products.
There is really no comparison when it comes to build quality and performance.
I have an XPS and I personally find it better than the M1 MBP in almost everything except power efficiency, so I have very different experiences compared to your other review in this thread.
The XPS doesn't make much noise and I notice no difference in speed while developing. Some tasks are even faster, because Docker runs natively instead of via Docker for Mac. The battery doesn't last as long, I guess, but it still lasts for 8 hours, which is far longer than I need.
This is not my experience with the XPS 15 (2019). It is substantially slower, it is loud, and the battery used to last less than an hour. Then there were the constantly dying chargers: I had a box of Dell chargers that each worked for a month before the laptop stopped recognising them.
For the tasks I do (I use Docker heavily), the M1 is more than twice as fast in low-power mode, and I've never heard the fans spin up or felt any slowdowns.
On the XPS, if I opened a heavier webpage, it was possible for the entire laptop to slow down and not even refresh the screen in time, so you witnessed a slideshow, and usually the only way out was to perform a hard reset.
Also random power-offs, and if it goes to sleep it won't wake up. I have to leave it for an hour and then maybe it will power on (sometimes I have to try a couple of times). This is actually the same experience I had with an earlier XPS 13.
Google "XPS won't power on" - plenty of people have this problem.
I think Linux is still much better than MacOS in terms of design and usability, but that doesn't really mean much when operating systems compete on compatibility with programs (which themselves compete on compatibility with operating systems). It is a vicious cycle.
If your job requires you to run AutoCAD or VSC++ or something, you're just going to use Windows. The average user isn't going to figure out how to use KVM or Wine or something. If your job requires some Linux/Unix tool, is the average user going to fiddle with it until it works on MacOS, or just use Ubuntu or something? MacOS is the worst of both worlds: it is both closed-source AND a minority OS.
I used a rolling-release distro for a while on a desktop and a NUC, and it was really nice and convenient. But I switched over to Ubuntu for a laptop ("they'll sort out the touchscreen drivers and onscreen keyboard situation," I told myself), and now I kinda regret telling people to "just use Ubuntu or something" in the past.
It worked when I first installed it, until quite recently, when a new version hit. Upgrading every package at the same time is obviously destabilizing: something has changed in the plumbing, under certain circumstances some GTK programs wait out a 30-second timeout before they start, and there's the whole snap Firefox debacle. I'm longing for the stability of a rolling release, oddly enough.
Anyway, I haven't used MacOS, but I've generally been surprised to find that my current system is hovering around near-Windows-level usability, other than the familiar terminal, which is nice. Probably time to try out Tumbleweed...
I think it is Surface envy. The iPads are just not very capable in productivity scenarios, and Macs are not very flexible. So, if you want a touchscreen device for leisure and a productive device for work, you end up with two machines with very similar specs on the inside. It does not make a whole lot of sense, but it is the state of the Apple ecosystem. Now, Microsoft has a whole bunch of other problems (mainly around reliability) that make the Surface a bit of a mess. But they do seem to have the right form factor for modern computing.