I would argue that just being in the office or not using AI doesn't guarantee any better learning for younger generations. Without proper guidance a junior would still struggle, regardless of their location or use of AI.
The challenge now is for companies, managers, and mentors to adapt to more remote and AI-assisted learning. If a junior can be taught that it's okay to reach out (and be given ample opportunities to do so), as well as how to productively use AI to explain concepts they may feel too scared to ask about because they're "basics", then I don't see why this would hurt in the long run.
Fully agree. I would also say it's easy enough to use Django for (almost) everything in a self-contained SaaS startup. Marketing can be done via Wagtail. Support is managed by a reusable app: a simple static element on every page (similar to Intercom) that redirects to a standard Django page and collects some info about the issue, including which user filed it (if authenticated), etc.
I try to simplify the stack further and use SQLite with Borg for backups. Caching leverages Diskcache.
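For the caching piece, python-diskcache ships a Django cache backend, so the settings fragment is small (the LOCATION path below is a placeholder):

```python
# Django settings fragment: route the default cache through diskcache,
# which stores entries on disk (pairs nicely with an all-SQLite stack).
CACHES = {
    "default": {
        "BACKEND": "diskcache.DjangoCache",
        "LOCATION": "/var/tmp/django_cache",  # placeholder path
        "TIMEOUT": 300,  # default entry lifetime in seconds
    }
}
```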
Deployment is slightly more complicated. I use containers (podman with systemd), but it could easily be a git pull and a gunicorn restart.
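For the podman-with-systemd part, a hand-written unit along these lines is one way to do it (service name, port, and image tag are placeholders; newer podman versions can also generate this kind of unit via Quadlet):

```ini
# /etc/systemd/system/myapp.service (hypothetical)
[Unit]
Description=Django app in a podman container
After=network-online.target

[Service]
Restart=always
ExecStart=/usr/bin/podman run --rm --name myapp -p 8000:8000 myapp:latest
ExecStop=/usr/bin/podman stop myapp

[Install]
WantedBy=multi-user.target
```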
My frontend practices have gone through some cycles. I found Alpine and HTMX too restrictive for my liking and instead prefer TypeScript with the django-vite integration. Yes, it means using some of the frontend tooling, but it also means I can use TailwindCSS, React, TypeScript, etc. if I want.
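The django-vite integration boils down to a couple of template tags in the base template (tag names per django-vite's documentation; the entry-point path is a placeholder):

```html
{# base.html: serve assets built by Vite through django-vite #}
{% load django_vite %}
<!doctype html>
<html>
  <head>
    {% vite_hmr_client %}           {# hot-module reload, dev mode only #}
    {% vite_asset 'src/main.ts' %}  {# the TypeScript entry point #}
  </head>
  <body>...</body>
</html>
```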
Any info on context length or comparable performance? Press release is unfortunately lacking on technical details.
Also, I'm curious whether there was any reason to put out such a press release without actually releasing the model (due in the summer)? What's the delay? Or rather, what was the motivation for a press release?
Amateur question, how are people using this for coding?
Direct chat and copy-pasting code? Seems clunky.
Or manually switching models in Cursor? Although that's extra cost and not required for a lot of tasks where Cursor's tab completion is faster and good enough, so you'd need to opt in on demand.
"Responsible capitalism" is not a term you often hear but it's something I think we could strive for.
Not-for-large-profit is a set of guiding principles that tries to sit in the middle of the spectrum between non-profit and profit-at-all-costs.
This is my initial attempt at an idea I've had floating around for a while and I would love to see feedback either in comments here or discussions/PRs on the GitHub repo.
I am evaluating a new machine to replace my laptop and desktop for coding (Python, Go, React), data exploration, running containers, and low-resource day-to-day business use (the Microsoft suite, Slack, light image editing, etc.), not necessarily all at the same time, but often. I travel a medium amount, so I can't just use a desktop, but running both seems cumbersome.
The M4 MacBook Airs are very tempting, and I think the size and weight of the 15" is not as off-putting as it once was. However, I agree with the criticism in this article: a lesser-quality display, and a little lacking in power (an M4 Pro option would be nice).
A 14" MacBook Pro is the current draw. Slightly heavier, but with the option of an M4 Pro, more memory (up to 48GB), and the nano-texture display for out and about.
I love the idea of the Framework 13" machines with Ubuntu: almost the same weight as a 13" Air, and with strong upgradeability. The disadvantages (to me) are that battery life on Linux is significantly shorter than on an Apple device (although exact numbers are hard to find), and that even with the new Ryzen AI Max processors and DDR5 memory, speed is much lower than the M-series chips with their on-package memory (although I'm open to counterpoints that the difference in speed isn't worth it).
The Apple software ecosystem is a soft draw, but to be honest there are alternatives to Apple Photos, which is the one I use the most.
> I love the idea of the Framework 13" machines with Ubuntu. Almost same weight as a 13" Air and with strong upgradeability. Disadvantages (to me) are the battery life on Linux is significantly less than an Apple device (although hard to find exact numbers
As one battery-life datapoint: I have a Framework with the previous-gen AMD board and the 61 Wh battery, and on Linux Mint with no special configuration I get about 7 or 8 hours in normal use (wifi + BT + average backlight, just normal browsing and file editing, not maxing out the CPU rebuilding a massive project for hours). That's totally fine for my needs; it's effectively a full workday, or an intercontinental airport wait plus flight, without power. I'm very happy with it, and the upgradeability has been great (I upgraded the mainboard, and the old one is now running as a home server).
Lacking power? Short of buying the very best Intel or AMD CPU (which needs a new mainboard and RAM), I never seem to find any PC upgrade even in the region of an M3.
For mobile, this kind of performance is insane. I'm usually happy as long as it's not a netbook CPU, since my 4th-gen i5 dual-core with HT is still up to anything I want to do with it.
That's something I'm puzzled about. If we look at both PC and Mac Geekbench results, which use the exact same Intel CPU as a baseline, Macs wipe the floor with any Intel or AMD processor. Am I reading it wrong?
I am not suggesting that Geekbench is intentionally favoring Apple Silicon. The code the benchmark runs just happens to run better on Apple Silicon. Geekbench also scales poorly with higher core counts, so multicore scores are almost useless.
IMO with benchmarks it's never a good idea to rely on a single score. You should compare many different scores. Preferably with benchmarks that test workloads you will actually make use of.
But even then, what benchmarks contradict it? You're claiming an inherent bias, but other benchmarks also run just as well on Apple Silicon when normalized for core count: Cinebench, SPEC, etc.
That sounds about right. It depends on the workload, of course, but e.g. when compiling Rust code, having an Apple M-series chip makes a huge difference. That alone would make it hard for me to consider switching to any non-Apple laptop.
From personal experience, I'd also say that there is a noticeable bump in performance between M Max and M Pro chips (i.e., running the same workload as colleagues on identical specs apart from the chip) that isn't really apparent in the benchmarks here.
No, the benchmark results for Apple laptops are accurate (since there is close to no variance in hardware config). It is the x64 processors that appear worse than they actually perform.
Might be an issue with the test methodology for x64 processors, e.g. cooling, RAM speeds, SSD speeds, etc. You should run Geekbench on your own machine. The C++ and C project compilation benchmarks I ran more or less match what I saw with Geekbench (in terms of percentage difference).
Basically, what I got is as follows (fully spec'd M4 MBA vs Ryzen 5900X, WSL2):
* for multi-core, my M4 Air is only 25% slower than my Ryzen 5900X desktop
* for single-core, my M4 Air is 0-20% faster than my 5900X
Compared to the M1 Air, it is almost exactly 2x as fast (Apple's claims are true in this regard), has a slightly brighter display, and has much better speakers. The "2x as fast" is sometimes actually "4x+ as fast" for some workloads, because you have to take thermal throttling into account.
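For anyone reproducing this kind of comparison, the percentages above are just relative deltas; a tiny helper (with made-up numbers, not my measurements) keeps the direction of the comparison straight:

```python
def pct_diff(a: float, b: float) -> float:
    """Percentage difference of a relative to b (positive means a is larger)."""
    return (a - b) / b * 100.0

# Illustrative compile times in seconds (hypothetical, for demonstration only):
m4_air, ryzen_5900x = 100.0, 80.0
print(f"M4 Air took {pct_diff(m4_air, ryzen_5900x):.0f}% longer")  # 25% longer
```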
I just went through this process and ended up getting the HP Omnibook Ultra Flip with a core 7 258v, running Ubuntu 24.10. Performance is excellent, battery life is the best I've ever had on an Intel running Linux. If you truly need more RAM than 32 GB it won't be an option though.
The Air or an MB Pro seems so nice until I remember the sting of dealing with the dev environment and Docker on a Mac compared to Linux. No amount of battery life or marginal jump in performance (which gets lost through the needed virtualization) will make up for that for me.
It does seem quite nice. Is the pen input decent?
If the build quality is OK, the pricing is quite good: in France the 32GB/1TB configuration is 650€ cheaper than an equivalent MacBook Air.
I have not looked for a proper CPU comparison, but I don't think it matters that much for a device of this type, and as you said, with Apple Silicon you lose a lot of the performance once you venture away from optimized software (which is still quite restrictive).
Considering the (relative) failure of the ARM push for Windows PCs I doubt ARM is going anywhere in the consumer market (that probably means it will stay niche in the server market as well but we'll see).
> even with the new Ryzen AI Max processors and the DDR5 memory, speed is much lower compared to the M-series soldered on a chip
Are you referring to memory bandwidth specifically? Yes, that difference is not worth it. Even if you fully load the CPU and GPU on Apple Silicon you will not come close to using all that bandwidth [0]. You're better off comparing real-world benchmarks instead.
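As a back-of-envelope check on the bandwidth gap (figures below are published specs as I understand them: a 256-bit LPDDR5X-8533 bus for the M4 Pro versus a typical 128-bit dual-channel DDR5-5600 laptop; treat them as illustrative):

```python
def peak_bw_gbs(bus_width_bits: int, megatransfers_per_s: int) -> float:
    """Theoretical peak memory bandwidth in GB/s: bytes per transfer x transfer rate."""
    return bus_width_bits / 8 * megatransfers_per_s * 1e6 / 1e9

print(peak_bw_gbs(256, 8533))  # 273.056, matching Apple's quoted ~273 GB/s for M4 Pro
print(peak_bw_gbs(128, 5600))  # 89.6 GB/s for dual-channel DDR5-5600
```

Peak bandwidth is exactly the number that real workloads rarely saturate, which is why real-world benchmarks are the better comparison.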
Sometimes they just don't allow you to use it. In the past I had an ISP router with heavily restricted custom firmware and a "hidden" username/password setup for authenticating with the ISP. I couldn't use my own router.
In that situation I had to use it as the modem and offload to a second router behind it. Not ideal.
Now I can freely pick hardware with my current ISP. Just need to find the time/money to upgrade to fibre everywhere to capitalize on the 10Gb/s.
Who are you doing this for? All the small children on HN who will be scandalized reading the word "porn" but are safe so long as the "o" is censored with an asterisk?
The obvious answer is that this user likely also comments on other platforms that do not allow using the word openly, and instead of the mental overhead of constantly deciding which site does and doesn't allow spelling out the word, they choose the safer option in every case.
Or their autocorrect has now learnt to respell it anyways.
Or, they didn’t really spend hours debating the pros and cons of using either spelling and just went with whatever felt ok at the time.
But why choose generosity when one can choose snarky ass instead.
Exactly. I'm applying societal pressure in the form of bullying in order to discourage language from being shaped by the profit motives of advertising agencies that push these puritanical cancers into the public lexicon.
I have no issue with the youth coming up with new slang or changing the definitions of words that are no longer as relevant as they once were. As you say, language evolves like that. But when the changes to language are being top-down forced by megacorps so they can more effectively target ads to children, that's when I want them burned in fire. I'm honestly surprised at just how justified I feel in being mean to the people who propagate this bullshit.
Have you considered that your outrage (justified or not) may be damaging to your own well-being? I do believe that in some cases this is an acceptable trade, but I also try to be accepting of things that don't seem to cause (too much) harm. For me, this is one of them.
Aside, do you also get angry about the use of the words coke, hoover, sellotape, etc?
Interesting. There was adhesive tape before Sellotape captured its place in the general lexicon; I guess the difference is that its use was not hard-forced (censoring) but soft-forced (advertising)?
Why do you follow your question with a refutation of the supposed response? They're probably doing it for themselves, like people saying "gosh". If you find that ridiculous, fine, but there's no need to strawman their intentions. Your chances of getting an honest, non-defensive response are pretty low.