Hacker News: chatman's comments

Will this work well with NVidia drivers?


Short answer: no.

The video is using Intel's onboard VGA chipset, which plays extra nice with Intel's onboard EFI. You can bet the framebuffer code between the EFI and 3rd-party hardware (AMD/Nvidia) sucks much, much worse.

I expect ServerFault posts about graphics instability in Fedora 29 any minute now...


Not yet according to his post. Only on Intel graphics.


The decline has begun. It will soon be time for Intel to shut shop.


I mean, they missed a few trends and are having some scaling issues on their next generation of transistors but they made like $60 billion in revenue last year and the company is worth ~$240 billion. I think they're going to be around for a few more years.


The taller they stand, the harder they fall.


No, actually. In tech, the typical path of a failed giant is slow decline for a while, a few wild gyrations, and finally patent trolling and brand necrophilia.


I am aware AMD has been making great strides lately, but you have to remember that AMD has a decade-long hole to dig themselves out of.

The only game in town has been Intel for almost 12 years now. When the Core lineup was released in 2006 [1], the wind quickly went out of AMD's sails. [2] AMD had almost matched Intel in market share; they just needed to keep the innovation coming. The problem was that Bulldozer and similar architectures just weren't competitive. Even with double the core count, Intel's chips were simply better for the vast majority of workloads. [3]

The CPU wars had gotten so one-sided that I believe Intel stopped pushing. With the exception of Gulftown [4], four cores was the name of the game from Kentsfield in 2006 [5] until Kaby Lake in 2017 [6]. Intel released its first desktop dual-core processor on April 16, 2005 [7]. Quad-core Intel chips came on January 8, 2007 [8]. Less than two years brought an extra two cores to the consumer-level CPU market, and then it stopped there for over 10 years.

When the Zen architecture was released to consumers on February 28, 2017 [9], Intel checked under the couch and magically found two extra cores. Coffee Lake was released on October 5, 2017 [10]. Funny how real competition ended a decade of stagnation, isn't it? If Zen hadn't done as well as it did, I could see the 8th-generation chips staying quad-core.

The supercomputer market is also dominated by Intel. I count two (2) supercomputers using Opterons [11]. Everything else is POWER or Xeon. All Apple computers use Intel CPUs.

Ryzen, Threadripper, and EPYC are great products and offer a staggering increase in performance over the previous generation of chips. Still, it's far from "game over".

1: https://en.wikipedia.org/wiki/Intel_Core_2#Models

2: https://www.cpubenchmark.net/market_share.html

3: https://www.hardocp.com/article/2011/10/11/amd_bulldozer_fx8...

4: https://en.wikipedia.org/wiki/List_of_Intel_Core_i7_micropro...

5: https://en.wikipedia.org/wiki/Kentsfield_(microprocessor)

6: https://en.wikipedia.org/wiki/Kaby_Lake

7: https://en.wikipedia.org/wiki/Pentium_D#Smithfield

8: https://en.wikipedia.org/wiki/Kentsfield_(microprocessor)

9: https://www.mobipicker.com/amd-ryzen-7-1800x-1700x-1700-rele...

10: https://en.wikipedia.org/wiki/Coffee_Lake

11: https://www.top500.org/list/2018/06/


And they made fun of RMS... He was telling you what the future holds. This is just a trailer of what is to come.


I agree, it is really unfortunate that even people within the software development profession take these issues so lightly.


I don't think it's so much that software devs take it "lightly". In my experience as an infosec consultant, the bigger problem is that most software devs are too cocky when it comes to security. Most think that security is just a subdomain of computer science (it is not!), and that because they took a crypto class in college, they are 100% qualified to handle the security themselves. They think they are taking it seriously, but they don't understand that knowing how to write software does not make you an expert in securing software.

Most devs don't seem to acknowledge that good security requires having a separate, dedicated person/team to handle it, just like how you would hire a lawyer rather than having your software devs handle legal issues.

I once posted on HN that every company that deals with sensitive data, big or small, must have a dedicated security person/team. My comment was downvoted/flagged, and I was bombarded with responses like "why would we waste the money on a security person? my dev team already knows to encrypt passwords".


This. I worked with an end-to-end encrypted communications company for 5 years, and learned a vast amount more about crypto, attack vectors, and security holes than I did in the previous decade or two, but I would never claim to be a security or crypto expert, or even competent at it.

In fact, I almost certainly know only a tiny fraction of what the actual experts in that company knew, but a number of people have told me that I know a lot more about it than the average developer.

That scares me, and if people flame someone for recommending that a dedicated security expert be hired by companies that handle sensitive data, I can only conclude it is out of ignorance - of what's out there, and what's possible.

On the other hand, there are economic realities to consider, especially in early-stage, underfunded startups. What do they do about this?


Where can mere mortals get an overview of just what you know? A lay of the land, scope, just to frame up what these problems really look like.

It's hard to even think about these things for those of us working at low levels: firmware, embedded, etc.

Your comment got me to thinking about what I don't know. Which is a whole lot.


I think you're missing the point. It's not about security, it's about data.


Ah, you're right. I skimmed over the top level comment and missed that. My bad.


Information wants to be free. That includes information you don't want to be free.


    Information wants to be free
    Unless it is about me


Where do I sign?


And? ...


They sell untested, unapproved products. It is a govt backed corporation.


Next, they will detain/arrest Kodi developers at airports.


FreeBSD community has banned "virtual hugs" in their community guidelines!


To save someone a bit of googling here's a link:

https://www.theregister.co.uk/2018/02/21/freebsd_code_of_con...


They've banned harassment (while falling into the common trap of requiring "consent" without defining the term in a way that doesn't rely on time-travel).

> Physical contact and simulated physical contact (e.g., textual descriptions like "hug" or "backrub") without consent or after a request to stop.


Discarding the "full stack in a box" idea, in general, just because of past experience with a poorly implemented vagrant setup seems naive. A good "full stack in a box" implementation (docker-compose, swarm, kubernetes etc.) can be a useful tool in testing/developing microservices.
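For what it's worth, a "full stack in a box" with docker-compose can be only a handful of lines. A minimal sketch (the service names, images, and ports here are hypothetical, not from any project in the thread):

```yaml
# Hypothetical docker-compose.yml: one app service plus the database it
# depends on, enough to bring a small stack up locally for testing.
version: "3"
services:
  api:
    build: .                 # builds the app from the local Dockerfile
    ports:
      - "8080:8080"
    depends_on:
      - db                   # start the database before the app
    environment:
      DATABASE_URL: postgres://app:app@db:5432/app
  db:
    image: postgres:10
    environment:
      POSTGRES_USER: app
      POSTGRES_PASSWORD: app
      POSTGRES_DB: app
```

A single `docker-compose up` then starts the whole stack with sane wiring, which is the kind of reproducibility a badly built Vagrant setup never gave you.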


Well there is obviously a point at which it will no longer work (way before Google scale). But the point is to stop testing services in a coupled way, the whole point of microservices is to make decoupling real, not to build a distributed monolith. Testing in a decoupled way helps this enormously.


Yeah, except it doesn't. In those microservices setups, more often than not the change in the data needed to accomplish a client request spans 3 or more services of "distance". Still, it needs to be accomplished. If you don't coordinate all the services involved (each with a different developer team, maybe from different contractors) AND if you don't do an end-to-end test, how can you be sure the requested change works?


> In those microservices setups, more often than not the change in the data needed to accomplish a client request spans 3 or more services of "distance". Still, it needs to be accomplished. If you don't coordinate all the services involved (each with a different developer team, maybe from different contractors) AND if you don't do an end-to-end test, how can you be sure the requested change works?

You're doing it wrong. If these services are really so deeply entangled that you can't change and test them one at a time, they shouldn't be independent services. Merge them, or otherwise rethink your service boundaries.


You can do these tests against a UAT environment. It doesn't have to run on your own box.

In my apps I use fakes in dev and test mode, which makes development very fast and easy. A few tests run against the actual UAT environment, but these are skipped unless a command line flag is passed.

Mostly it's inspired by the article "Mocks and explicit contracts": http://blog.plataformatec.com.br/2015/10/mocks-and-explicit-...
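The fakes-in-dev pattern described above can be sketched in a few lines. This is a minimal Python illustration; the class names and the `USE_REAL_API` flag are made up for the example, not from the article:

```python
import os

class WeatherAPI:
    """Talks to the real UAT endpoint (stubbed out in this sketch)."""
    def temperature(self, city: str) -> float:
        raise NotImplementedError("would issue a real HTTP request")

class FakeWeatherAPI:
    """In-memory fake used in dev and test mode: fast and deterministic."""
    def __init__(self, data):
        self.data = data
    def temperature(self, city: str) -> float:
        return self.data[city]

def get_client():
    # A few integration tests flip this flag to hit the real UAT
    # environment; everything else runs against the fake.
    if os.environ.get("USE_REAL_API") == "1":
        return WeatherAPI()
    return FakeWeatherAPI({"Berlin": 21.5})

client = get_client()
print(client.temperature("Berlin"))  # prints 21.5 in dev/test mode
```

The app code only ever calls `get_client()`, so swapping the fake for the real client is a one-flag change rather than a code change.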


Not as exciting as the Solr 7 release.


I've been using Solr since 1.3 and found trying to use Elastic pretty crazy annoying. I'd love to swap the ELK stack for a Solr-based setup.


Lucidworks made Banana, a Kibana port to Solr, but it never took off like Kibana did. I don't find the differences between Solr and Elasticsearch that big, and I've always preferred the Solr way of configuring things to Elasticsearch's JSON API. The Solr data import handler has also been a big timesaver.


Lol, people from Citi and ING are cited as experts. ;-) Big financial corporations are threatened by Bitcoin.


It's like people from energy companies being interviewed as experts on solar energy.


It is back down now. Roger Ver is artificially pumping up the price of BCH. BCH is a trash coin.

