Well, the Madoff trustee was able to see who received payouts from Madoff thanks to KYC rules. This allowed the trustee to go to people, most of whom were completely innocent, and ask them to give back some of the money they redeemed from Madoff.
FTX on the other hand has very few KYC docs so the trustees are left with wallet addresses.
Not sure how you contact a wallet address to ask for money back.
Yes, the real dollars went somewhere. I would guess the majority of them were lost to poor trades by Alameda. Combine that with the fact that a lot of those real dollars were converted into crypto schemes which are now down 90%, plus the insane spending on houses and politics and who knows what else. I’m waiting anxiously to see what the truth is once we get some real data.
the leading zeros are a sign that the poster is unusually short-sighted and only concerned about the next 98,000 years or so in their fixed-size date representations.
before i get tarred and feathered, i'm just going for a cheap ironic laugh here, no offense intended! :) more seriously, i sorta doubt humanity is going to be around long enough to care, and if we are, the odds of some apocalypse resetting our calendars aren't zero.
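for the curious, the fixed-size representation being joked about is just zero-padding the year to five digits; a toy Python sketch (the five-digit width and the function name are my own assumptions, not anything from the post):

    # Zero-pad years to a fixed five-digit width.
    def five_digit_year(year: int) -> str:
        if not 0 <= year <= 99999:
            raise ValueError("Y100K problem: year does not fit in five digits")
        return f"{year:05d}"

    print(five_digit_year(2023))   # "02023"
    print(five_digit_year(99999))  # "99999" -- roughly 98,000 years of headroom from today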
During my first week in my first programming job I remember overhearing a senior engineer saying “don’t worry, there’s not another leap year for 2 years”. I was still working there on February 29, he wasn’t. It was a stressful day. My condolences go out to the engineers who will be working on January 1st 100000.
My hope is that some programmer dictator will institute a form of the Discordian calendar:
- All 5 seasons are the same length
- The days of the year always fall on the same weekday, as the week is 5 days long
- The year always starts with the first day of the week
- The leap day is not assigned a weekday
Things I would change though:
- Start the year at the winter solstice
- Put the leap day at the end of the year
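The arithmetic behind those properties is tiny. A minimal Python sketch, using the canonical Discordian season and weekday names; the variant above would just shift the epoch to the winter solstice and move the leap day to the end of the year (the 1..365 day numbering and handling the leap day out of band are my assumptions):

    # Five 73-day seasons and a 5-day week; 365 % 5 == 0, so weekdays never drift.
    SEASONS = ["Chaos", "Discord", "Confusion", "Bureaucracy", "The Aftermath"]
    WEEKDAYS = ["Sweetmorn", "Boomtime", "Pungenday", "Prickle-Prickle", "Setting Orange"]

    def discordian(day_of_year: int) -> str:
        if not 1 <= day_of_year <= 365:
            raise ValueError("the leap day has no weekday; handle it out of band")
        season = SEASONS[(day_of_year - 1) // 73]   # which 73-day season
        day = (day_of_year - 1) % 73 + 1            # day within that season
        weekday = WEEKDAYS[(day_of_year - 1) % 5]   # same date, same weekday, every year
        return f"{weekday}, {season} {day}"

    print(discordian(1))    # Sweetmorn, Chaos 1 -- the year always starts the week
    print(discordian(365))  # Setting Orange, The Aftermath 73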
You can rename the months any time you feel like. The problem is getting other people to go along with your renaming.
Once upon a time, enough people cared what the Pope had to say that he could actually succeed in changing the calendar for everybody. But it would be a rather interesting future history for the Catholic Church to regain power to the point where that became true once again.
I understand Madoff's scheme started in the 1960s/1970s. The feeders to the fund date way back. The SEC enforcement action in the early '90s wasn't directly related to the Ponzi scheme; it was more aimed at the feeders. The Wikipedia article has this: "Federal investigators believe the fraud in the investment management division and advisory division may have begun in the 1970s.[27][failed verification] However, Madoff himself stated his fraudulent activities began in the 1990s.[28]". I would trust the Feds. https://en.wikipedia.org/wiki/Madoff_investment_scandal
Never made sense how any software would consider the year value as _two characters_ when an integer works far better. There's even a 0 epoch date for crying out loud.
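To make the two-character problem concrete, here's a toy Python sketch of the "windowing" guesswork a two-digit year forces on you, versus the unambiguous epoch-based integer (the pivot value of 70 is a common convention, not anything from the post):

    import datetime

    # Two characters force a guess: is "69" 1969 or 2069? A pivot "window" decides.
    def guess_year(two_digits: str, pivot: int = 70) -> int:
        yy = int(two_digits)
        return 1900 + yy if yy >= pivot else 2000 + yy

    print(guess_year("69"))  # 2069
    print(guess_year("70"))  # 1970

    # Versus the integer: seconds since the 0 epoch are unambiguous.
    print(datetime.datetime.fromtimestamp(0, datetime.timezone.utc))  # 1970-01-01 00:00:00+00:00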
Possibly, the origins of the practice predate computers entirely and go back to their predecessors in business data processing: unit record equipment. As such, it may be like many other traditions - it made complete sense when it was invented, yet people still clung to it long after it ceased to do so.
Unit record equipment generally didn’t use binary integers, instead using BCD; and most early business-oriented computers followed their example. Binary integers were mainly found on scientific machines (along with floating point), until the mid-1960s, when the hard division between business computing and scientific computing faded away, and general purpose architectures, equally capable of commerce and science, began to flourish.
Because when the Y2K problem was created, computers had only recently been invented, every byte counted, and there was no history of software best practices.
Even by 1980, which was 5 years before any recorded mention of the Y2K problem, an IBM mainframe, of the kind a university might have, had only around 4 MB of main memory, which had to support many concurrent users.
Such computers would have been unable to load even a single typical binary produced by a modern language like Go or Rust into their memory - yet they supported dozens of concurrent users and processes, doing everything from running accounting batch jobs, to compiling and running programs in Assembler, COBOL, FORTRAN, PL/I, or APL, to running interactive sessions in languages like LISP or BASIC. Part of how they achieved all that was not wasting any bytes they didn’t absolutely have to.
I think this is a generational divide from when computers were significantly more limited. If you look at older data formats, they are nearly all fixed width - whether due to punch cards, memory constraints, performance considerations, or just bandwagoning. From today's vantage point, prematurely optimizing the date representation seems silly.
Early mainframes read data on punch cards or tape files which were character-based (or possibly, binary-coded decimal). Two digits were used for the year to save memory.
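A toy sketch of what such a fixed-width, character-based record looks like in practice - the field layout here is entirely made up for illustration, but the pattern (every field at a fixed column, two columns for the year) is the one being described:

    # Hypothetical 35-column slice of a punch-card-style record: fixed offsets, no delimiters.
    record = "DOE       JOHN      991231000104250"

    last_name    = record[0:10].strip()
    first_name   = record[10:20].strip()
    date_yymmdd  = record[20:26]        # "991231" -- 1999 or 2099? The card can't say.
    amount_cents = int(record[26:35])   # digits only; a sign or decimal point would cost columns

    print(last_name, first_name, date_yymmdd, amount_cents)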
You are right about the year KYC came into effect, so good job!!
But that makes my point. As of 2002, Madoff had to have clear records of who he redeemed money to.
When the bankruptcy trustee went after people who redeemed, they only went back a few years and were able to use the KYC docs on file since 2002 to get the money back.