
> when I asked it who was “President in 2022,” it responded (inter alia) with “My training data only goes up until 2021, so I am not able to provide information about events that have not yet occurred.”

> Notice that it goes off the rails in its answer because it wrote me that in 2023!

This is not a mistake. If the bot is trained only on data up to 2021, then as far as the bot is concerned 2022 is in the future, no matter whether you ask in 2022, 2023, or 2030.



We’ll probably see some real skill when Bing gets ChatGPT integration, but I fear even “ChatGPT Premium” won’t support browsing the web.


It needn't be real-time though, and it's arguably better if it isn't. But I'd expect to see the training data updated at least daily at some point in the future.


You.com's ChatGPT feature incorporates web links.


The entire article is positively dripping with the sort of self-important puffery so often shown by academics. He's clearly looking for gotchas in a system he doesn't understand, and of course he doesn't miss an opportunity to take a dig at educators in America and their sub-standard institutions.

Maybe a doctorate in PoliSci doesn't really translate to a deep understanding of AI. I'll take the author's opinions on technology with the same gravitas with which they might take my own political opinions.


Me: What date is today?

ChatGPT: Today's date is January 14th, 2023

ChatGPT clearly knows the current date and year.


Perhaps, but that's an entirely different thing from having items from 2022-2023 in its training data.

That part wouldn't be any different with a human. If you locked a person in a cave in 2021, and released him in 2023, he could know that it was 2023 (at least after you told him) without knowing (e.g.) who won the World Series in 2022.
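
That also fits the simplest explanation: the current date is presumably supplied at inference time (e.g., in a hidden system message) rather than learned from training data. A rough sketch of that pattern, assuming the OpenAI chat completions API; the model name and prompt wording here are purely illustrative, not how ChatGPT itself is actually wired:

  # Hypothetical sketch: inject today's date at inference time so the model
  # can answer date questions without any post-cutoff training data.
  from datetime import date
  from openai import OpenAI  # pip install openai

  client = OpenAI()  # reads OPENAI_API_KEY from the environment

  response = client.chat.completions.create(
      model="gpt-3.5-turbo",  # example model name, assumption only
      messages=[
          {
              "role": "system",
              "content": (
                  f"Current date: {date.today().isoformat()}. "
                  "Knowledge cutoff: 2021."
              ),
          },
          {"role": "user", "content": "What date is today?"},
      ],
  )
  print(response.choices[0].message.content)

With that setup the model can state today's date correctly while still knowing nothing about events after its cutoff, which is exactly the cave scenario above.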


The criticism wasn’t that ChatGPT doesn’t know stuff from 2022, it was that ChatGPT is stating that it can’t tell you because it hasn’t happened yet.

I don’t know if this is because of boilerplate formulations about the cutoff date that aren’t true ChatGPT output, or rather because ChatGPT has a poor understanding of time, as examples shared in previous HN threads have demonstrated.


So you're suggesting that because it can tell the current date, it should have said "had not yet occurred" instead of "have not yet occurred".

That's kind of subtle.


A more straightforward answer would have been “My training data only goes up until 2021, so I don’t know who is president in 2022.”

However, the answer could be more useful. When you actually ask it “Is Biden president?”, ChatGPT answers “As of my knowledge cutoff in 2021, Joe Biden is the president of the United States, after being inaugurated on January 20, 2021.” So why didn’t it give that information for the first question?

It is a valid criticism that ChatGPT is very inconsistent and often misleading in its representation of what information it is in principle able to provide.


In fact, the probability that Biden had died or become incapacitated by now wasn't nil, so it wouldn't even be right to extrapolate based on term lengths.


Then a straightforward answer would be something along the lines of "Biden, assuming he hasn't died or become incapacitated since <date>, which is the last date I have data for."



