
30 and 40 series only? My 2080 Ti scoffs at the artificial limitation



So they branded this "Chat with RTX", using the RTX branding, which originally meant "ray tracing". And the full name of your 2080 Ti is the "RTX 2080 Ti".

So, reviewing this...

- they are associating AI with RTX (ray tracing) now (??)

- your RTX card cannot chat with RTX (???)

wat


The marketing whiff on ray tracing happened long ago. DLSS is the killer app on RTX cards, another 'AI'-enabled workload.


No support for bf16 in a card that was released more than 5 years ago, I guess? Support starts with Ampere?

Although you'd realistically need 5-6 bit quantization to get anything large/usable enough running on a 12GB card. And at that point I think it's just plain CUDA, so you should be able to use a 2080 Ti.
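Rough back-of-the-envelope sketch of why that's the case, counting only the weights (ignoring KV cache and activations) and assuming a 13B-parameter model purely for illustration:

  # Weight memory only: params * bits / 8 bytes
  def weight_vram_gb(n_params: float, bits_per_weight: float) -> float:
      return n_params * bits_per_weight / 8 / 1e9

  for bits in (16, 8, 6, 5, 4):
      print(f"13B model at {bits}-bit: ~{weight_vram_gb(13e9, bits):.1f} GB")
  # 16-bit: ~26 GB, 8-bit: ~13 GB, 6-bit: ~9.8 GB, 5-bit: ~8.1 GB, 4-bit: ~6.5 GB

So fp16 is way out of reach, and even 8-bit barely misses; 5-6 bit is about where a 13B-class model fits in 12GB with room left for the KV cache.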


That was my first question, does it display pretty ray traced images instead of answers?


RTX is a brand more than ray tracing.

It is largely an arbitrary generational limit.


> I pull my PC with Intel 8086 out of closet

> I try to run windows 10 on it

> It doesn't work

> pff, Intel cpu cannot run OS meant for intel CPUs

wat

Jokes aside, Nvidia has been using the RTX branding for products that use Tensor Cores for a long time now. The limitation here is presumably that 1st-gen Tensor Cores don't support the required precisions.
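If you want to check your own card, a minimal sketch (assuming PyTorch is installed; Ampere and later report compute capability 8.0+, Turing reports 7.5 and lacks bf16):

  import torch

  if torch.cuda.is_available():
      major, minor = torch.cuda.get_device_capability(0)
      # bf16 Tensor Core support arrived with Ampere (compute capability 8.0)
      supports_bf16 = (major, minor) >= (8, 0)
      print(f"Compute capability {major}.{minor}, bf16 supported: {supports_bf16}")

(torch.cuda.is_bf16_supported() does roughly this check for you.) That doesn't tell us Nvidia's actual gating logic for Chat with RTX, just where the hardware cutoff plausibly sits.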


Yeah, seems a bit odd, because the TensorRT-LLM repo lists Turing as a supported architecture.

https://github.com/NVIDIA/TensorRT-LLM?tab=readme-ov-file#pr...


I, too, was hoping that my 2080 Ti from 2019 would suffice. =(


Don't worry, they'll be happy to charge you $750 for an entry level card next generation that can run this.



Yes peasants, Nvidia requires you to buy the latest and greatest expensive luxury gear, and you will BEG for it.


You can use an older 30-series card. No latest & greatest required.


A 4060 8gb is $300.



