Instead of thinking of a bet as saying "I have good cards", think of it as saying "I have an advantage in this pot", which is not a lie.
In poker, advantages can come from cards or from other objective measures such as position and stack size, and of course from subjective measures like being able to read your opponent.
I feel one of the most useful skills picked up from poker that people don't explicitly speak about is managing your information effectively.
Deceiving my opponent has the connotation of something that happens in a single instance. Once you realize that you can't convincingly deceive your opponents in poker in perpetuity, it becomes a game of managing your image: revealing the right information while being conscious of the information you shared in the past (if you're playing someone skilled or perceptive, that is).
On the flip side, what an excellent game to help people pay attention to signals, figure out how to weigh them appropriately, and act on them when the situation calls for it.
My claim is a bit stronger: not only can you play without lying, you don't sacrifice anything by doing so. You can play at the top level without lying, and you gain no advantage by lying. In essence, at optimal play you ignore whatever your opponent says; there are only the bets and game actions, which are independent of the cards held.
The original claim is that people hold the misconception "that poker is about lying or that you need to lie to play poker".
The claim is not that deception can be used as a strategy at all; that, btw, is actually an uninteresting claim: in almost all games, you can lie to your opponent and probably gain some advantage.
If I were coaching a beginner poker player, I would honestly tell them to play statistically sound poker. That's a good way to make a lot of money.
By posture do you mean act verbally and physically? Or bet as if you had a good hand?
The first is mostly inconsequential in poker: you should avoid having tells in your posture and speech, but the goal is to avoid conveying information about your hand, not to convey false information about it to deceive.
The second is just the game itself. Acting as if you had strong cards has a cost, and it is not lying: when you bet you are not saying "I have a hand". In a sense you may bet with a bad hand, but you are really forcing your opponent to pay for a chance to win the pot, on account of your hand potentially being a strong one. You are truthful in your representation of a potentially strong hand.
In fact, if you were to bluff in a situation where you could never have held a strong hand, it would be a mistake, and you would stand to lose expected value.
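To make that concrete, here is a minimal sketch of the underlying expected-value arithmetic (pot and bet sizes are made up for illustration, not from any specific hand): a pure bluff only profits when the opponent folds often enough, and an opponent who knows your line can never contain a strong hand has little reason to fold.

    # Illustrative EV of a pure bluff; all numbers are hypothetical.
    pot = 100.0  # chips already in the middle
    bet = 100.0  # size of our bluff

    def bluff_ev(fold_prob):
        # We win the pot when the opponent folds and lose our bet when
        # called (ignoring any chance the bluff improves to a winner).
        return fold_prob * pot - (1 - fold_prob) * bet

    print(bluff_ev(0.6))  #  20.0: profitable if they fold often enough
    print(bluff_ev(0.5))  #   0.0: break-even, at fold_prob = bet / (bet + pot)
    print(bluff_ev(0.1))  # -80.0: near-certain calls make the bluff a losing play

The break-even fold frequency is bet / (bet + pot); when your betting line could never contain a strong hand, a perceptive opponent's fold probability drops well below that threshold, which is exactly why such a bluff loses expected value.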
Yeah, it's basically the main thing I was taught to do to avoid any chance encounters with (animal) predators growing up: walk confidently and present as a fellow predator and not a prey animal.
Definitely sounds like a plausible and fun episode.
On the other hand, real history is filled with all sorts of things being treated as a god that were much worse than an "unreliable computer". For example, a lot of the time it's just a human with malice.
It’s important not to confuse entertainment with a serious understanding of the consequences of systems. For example, Asimov’s three laws are great narrative tools because they’re easy for everyone to understand and provide great fodder for creatively figuring out how to violate them. They in no way inform you about the practical issues of building robots from an ethical perspective, nor about the real failure modes of robots. Same with philosophy and self-driving cars: everyone brings up the trolley problem, which turns out to be a non-issue because robotic cars avoid trolley-problem situations way in advance and just try to lower the energy in the system as quickly as possible rather than trying to solve the ethics.
Yes. This is a component of media literacy that has been melted away by the "magic technology" marketing of the 2000s. It's important for people to treat these stories with allegorical white-gloves rather than interpreting them literally.
Gene Roddenberry knew this, and it's kinda why the original Trek was so entertaining. The juxtaposition of super-technology and interpersonal conflict was a lot more novel in the 60s than it is in a post-internet world, and was therefore easier to understand as a literary device. To a modern audience, a Tricorder is indistinguishable from an iPhone; the fancy "hailing channel" is indistinguishable from Skype or FaceTime.
Doesn’t apply. Disease is a societal group problem. Part of the social contract of living in that society is vaccination. You don’t have to get vaccinated but you then don’t get to enjoy the privileges of living with others in the community.
This isn’t anything like the trolley problem. And yes, taking actions has consequences, intended or otherwise. That’s not the trolley problem either.
"Ms. Sackett, with the aid of film clips, said that "The Return of the Archons," from the original series, was a good example of how Mr. Roddenberry employed elements of humanism in his works.
In that episode, a planet's population follows, in a zombie-like manner, a mysterious cult-like leader, who allows no divergent viewpoints.
The society absorbs individuals into its collective body and the world is free of hate, conflict and crime but creativity, freedom and individualism are stifled.
Ms. Sackett said that "Archons," like other Star Trek storylines, warns how people can be controlled by religion. In the end, the viewer discovers the cult leader is actually a computer."
"[N]o divergent viewpoints" sounds like Stackoverflow and forums run by software developers in general. The behaviour of "developers" can be extremelly cult-like.
Creativity, i.e., new work that is not merely a recombination of old work, does not seem compatible with "AI". The latter relies on patterns found in old work.
Look at the British Post Office scandal - "the computer is always right".
Say what you will about a human, but unless you're a religious zealot or a blind follower, you generally don't believe the leader to be infallible. But through the magic of silicon you can shut people up more effectively.
This makes computers an accelerator of the problem, and therefore warrants caution any time their output may be relied upon for life and death decisions.
>Even in a lively discussion it was not compatible with Article 10 of the Convention to pack incriminating statements into the wrapping of an otherwise acceptable expression of opinion and claim that this rendered passable those statements exceeding the permissible limits of freedom of expression.
Although the expression of this opinion is otherwise acceptable, it was packed with "incriminating statements". But the subject of these incriminating statements is a 2000-year-old mythical figure.
At the time, did you think the quality of that DVD was about the same as the experience you got in the theater?
The parent post is arguing that the gap in experience between home theaters and theater theaters has shrunk immensely. Right now I have an 85" OLED in my living room - that's not a thing that existed in 2002.
No, but it was good enough for most movies. The person you replied to is correct: It was glorious at the time. We were all amazed by graphics, even on those old tvs. The "movie theater experience" wasn't worth the hassle for anything but movies with good action and graphics - things like comedies didn't get uniquely better at the theater.
It didn't need to be about the same or better; it just needed to be good enough that you could appreciate not dealing with the downsides. The theaters weren't that good back in the late 90's (in fact, most of the ones I visited in my teens were renovated to be more current sometime around 2010 or so). All people needed were more realistic alternatives. More and more folks were getting cable, DVD players were more affordable, and places like Walmart sold DVDs for less than you'd pay for a full-price movie ticket. Netflix started in the late 90s too.
Yes, I know folks could rent videos before this. I remember walking down to rent NES games when I was young - right next to the movies at the grocery store. This was a far cry from the stores of the late 90s, though. They got better (and worse).
> No, but it was good enough for most movies. The person you replied to is correct: It was glorious at the time. We were all amazed by graphics, even on those old tvs.
I genuinely don't know what you're talking about. No it wasn't.
Movies on TV weren't glorious at all. They weren't "amazing." They were what you made do with. And when a classic movie played at your local arthouse theater you grabbed a ticket because it was so much better. The image quality. The sound. Seeing the whole image rather than a bunch of it hacked off.
That's why we went to the theater. Not just for action. For comedies too. Which is why comedies made tons of money at the theater!
And while maybe not amazing, they were wonderful at the time. Do you remember folks being amazed at the graphics on the PS1 or, heck, even the N64? Those weren't good, really, but at the time? Yeah, it was good enough. The late 90's started seeing bigger TVs and sound systems. DVDs brought options to see all that stuff on the sides if you wanted. You didn't "make do" if you were out there buying modern tech - maybe you made do with whatever brands they sold at Walmart - but then again, they sold game systems, so it really wasn't "store brands only" or anything like that.
I'm not sure where you lived, but absolutely no theaters around me showed classics, save for something like Star Wars or Disney movies. One played second run movies - the ones in the space between theater release and home video release. So no, no one went to things like that. I graduated high school in the mid 90s, and both local theaters were pretty run down places that treated employees badly. The ones in smaller surrounding towns, if they had it, usually only played a handful of movies and were old places. And this seemed pretty normal for Indiana, outside of perhaps Indianapolis or a few richer areas.
I’ll chime in as a greybeard. Did we think the DVD was the same as being at the theater? It really depends on who your friends were. Some of us kids had techie parents with things like VGA projectors for presentations. We would take these and play DVDs off our full-tower Pentium 3s for a movie-theater-like experience. I fondly remember watching the Matrix bonus content with my friends on a giant 100ft wall.
Are you saying you'd order raw quality differently than:
2002 TV setup < 2022 TV setup < movie theater
Or are you just saying that a home TV setup is still not as good as a movie theater? The point for the latter was the delta between home and theater used to be much larger, not that the delta is now 0, hence a decrease in theater ticket sales would make sense even if people were watching more movies. If the former, what order do you see it and what leads you to order them in the way you do?
No, what I mean is that the 2002 experience was awesome for us in our time, like the 2022 one is for people of today. But both experiences still pale compared with a movie theater. It's like, for us at the time, the DVD was an 8 and the modern TV is maybe a 10, but the movie theater in both cases was like a 10,000. It was, and it is, on a completely different level.
The big difference maker imo in the movie theater experience is size and sound. You still need to drop about the same few thousand dollars today that you had to drop in 2002 to buy a proper projector and sound system. An 85-inch low-pixel-density screen and a sound bar ain't it; but if it is for you, you are probably not a discerning audiophile, and would probably have been fine with whatever was sold in a comparable market segment in 2002 (refrigerator-width CRT displays were in fact all the rage and very desirable at one point).
You can drop about $800 on a great 1080p projector, a screen, and a pair of AirPods that will give you better surround sound than most speaker systems.
My projector screen takes up more of my vision than any movie theater screen I've ever seen except IMAX.
I'm sorry but airpods and a 1080p screen from your couch are on a different planet compared to theater sound and even liemax or smaller formats. You can't feel sound from an airpod in your chest.
It's neither of these options in this false dichotomy.
100M people signed up and did at least one task. Then, most likely, some percentage of them discovered it was a useful thing (if for nothing else than just to make more memes) and converted into MAUs.
If I had to use my intuition, I would say it's 5-10%, which, in the context of a single day, represents a larger product launch than most developers will ever participate in.
Of course the ongoing stickiness of the MAU also depends on the ability of this particular tool to stay on top amongst increasing competition.
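As a back-of-the-envelope sketch of that funnel (using only the guessed rates above, not any reported figures):

    # Hypothetical funnel math: 100M sign-ups, guessed 5-10% MAU conversion.
    signups = 100_000_000
    for conversion in (0.05, 0.10):
        print(f"{conversion:.0%} -> {signups * conversion:,.0f} MAU")
    # 5% -> 5,000,000 MAU
    # 10% -> 10,000,000 MAU

Even the low end of that guess would mean millions of monthly actives from a single launch window.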
Apparently OpenAI is losing money like crazy on this and their conversion rates to paid are abysmal, even for the cheaper licenses. And not even their top subscription covers its cost.
Uber at a 10x scale.
I should add that, compared to the hype, at a global level Uber is a failure. Yes, it's still a big company, and yes, it's profitable now, but it launched 10+ years ago, is only now becoming net profitable over its existence, and shows no signs of taking over the world. Sure, it's big in the US and a few specific markets. But elsewhere it's either banned for undermining labor practices, faces stiff local competition, or just isn't cost-competitive and won't enter the market, because without the whole "gig economy" scam it's just a regular taxi company with a better app.
It's quite hard to say for sure, and I will preface my comment by saying his blog posts are very long and quite doomerist about LLMs, but he makes a decent case about OpenAI's financials:
A very solid argument is like the case against propaganda: it's not so much about what is being said but about what isn't. OpenAI is basically shouting about every minor achievement from the rooftops, so the fact that they are remarkably silent about financial fundamentals says something. At best something mediocre, more likely something bad.
All very fair caveats/heads up about Ed Zitron, but just for context for others: he is an actual journalist that has been in the tech space for a long time, and has been critical of lots of large figures in tech for a long time. He has a cohesive thesis around the tech industry, so his thoughts on AI/LLMs aren't out of nowhere and disconnected.
Basically, it's one of those things you may read and find that, all things considered, you don't agree with the conclusions, but there's real substance there and you'll probably benefit from reading a few of his articles.
For an increasing plurality (possibly even majority at this point) of sites where the purpose is not purely to read text, this is effectively equivalent to saying "you can just not use the site."
You can go back to university.