Perhaps, but that's an entirely different thing from having items from 2022-2023 in its training data.
That part wouldn't be any different with a human. If you locked a person in a cave in 2021, and released him in 2023, he could know that it was 2023 (at least after you told him) without knowing (e.g.) who won the World Series in 2022.
The criticism wasn’t that ChatGPT doesn’t know stuff from 2022, it was that ChatGPT is stating that it can’t tell you because it hasn’t happened yet.
I don’t know whether this is because of boilerplate statements about the cutoff date that aren’t genuine ChatGPT output, or because ChatGPT has a poor understanding of time, as examples shared in previous HN threads have demonstrated.
A more straightforward answer would have been “My training data only goes up until 2021, so I don’t know who is president in 2022.”
However, the answer could be more useful. When you ask it directly “Is Biden president?”, ChatGPT answers “As of my knowledge cutoff in 2021, Joe Biden is the president of the United States, after being inaugurated on January 20, 2021.” So why didn’t it volunteer that information in response to the first question?
It is a valid criticism that ChatGPT is very inconsistent, and often misleading, in how it represents what information it is in principle able to provide.
ChatGPT: Today's date is January 14th, 2023
ChatGPT clearly knows the current date and year.