It's pretty fun, made it to the $1B valuation on my last attempt :)
I read bits of the instruction guide a couple times, but some things weren't super clear at first:
1. I didn't realize how important it is to raise another round. Is there ever a reason not to? The money bag was a small little thing, and I didn't know to click on it before reading the guide.
2. 90% monthly churn seems pretty bad... but maybe it isn't? Not sure what to do about it, improve the product maybe? Should you be doing marketing before product market fit?
3. At first (when I didn't read the guide) I thought marketers help us get customers. But I think they just reduce CAC?
4. Once you have a larger team it gets very heavy on the "Build" clicking. https://drive.google.com/file/d/1GRpxPfcpAjWem-3gs2Tnd_5xDwF...
5. Do the tasks matter? I never figured out what the numbers mean. I never deleted any of them - should I?
6. Do ops people do anything?
7. Task velocity is a bit unclear, and it's only clear that it's tied to the blue bar if you read the guide. Would be nice to show it on hover maybe, and could show some sort of aggregate across your team?
8. Do I get anything for investing in my product? Does the product get better over time in some way? Would be nice to track that.
I love LinkedIn as a founder/developer. It's become my main social network after the Twitter acquisition.
Your experience depends on what shows up in your feed. For me, it's mostly developers talking about web performance optimization, small business owners talking about conferences they've been to, people sharing posts or videos they've released... and the occasional post from the local council or someone criticizing Facebook's AI features.
Posting on LinkedIn doesn't necessarily mean bullshitting or low-value content. My two most popular posts are about single-page app performance [1] and TCP slow start [2]. And when I talk to people using our product they mention they have team members regularly sharing my posts in their company (but they might not "like" it on LinkedIn).
I might not be getting thousands of engagements, but there's little point reaching random people who aren't interested in working on the same problems as me either.
The change (and the Directive) was the result of banks withdrawing fee-free basic bank accounts across the bloc as a cost-cutting measure in the wake of the 2008 financial crisis. The Directive reversed that and made basic bank accounts universally available across the bloc.
Local overrides are super useful for testing site speed:
• Your local setup is likely different from production (not serving from production domains, not using image resize services, using different HTTP compression...)
• You might not be able to run tests on localhost, e.g. if you're an external consultant or working in technical SEO (who often want to give specific recommendations to devs as dev attention is scarce and expensive)
There are still some limitations of testing changes in DevTools:
• Testing a fresh load is a pain (need to clear OS-level DNS cache, need to clear the connection cache in Chrome, need to clear cookies/service worker)
• DevTools throttling doesn't represent server connections accurately (you need to throttle at the OS level for accurate data, which slows down your whole computer and requires admin rights)
WebPageTest [1] and my own tool DebugBear [2] now support experiments to make it easy to try out content changes in a controlled lab environment.
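If you'd rather script the "fresh load" part than click through DevTools each time, here's a rough sketch using Puppeteer and the Chrome DevTools Protocol. The URL and throttling numbers are placeholders I made up, and note this still can't clear the OS-level DNS cache or throttle below the application layer:

```typescript
// Sketch of a scripted "fresh load" measurement with Puppeteer (npm install puppeteer).
// Clears Chrome's cache/cookies and applies throttling via the DevTools Protocol.
import puppeteer from 'puppeteer';

async function measureFreshLoad(url: string): Promise<number> {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  const cdp = await page.target().createCDPSession();

  // Start from a cold browser state (HTTP cache, cookies).
  await cdp.send('Network.clearBrowserCache');
  await cdp.send('Network.clearBrowserCookies');

  // Application-level throttling (not the same as throttling at the OS level).
  await cdp.send('Network.emulateNetworkConditions', {
    offline: false,
    latency: 150,                                  // added round-trip latency in ms
    downloadThroughput: (1.6 * 1024 * 1024) / 8,   // ~1.6 Mbps in bytes/second
    uploadThroughput: (750 * 1024) / 8,            // ~750 Kbps in bytes/second
  });

  const start = Date.now();
  await page.goto(url, { waitUntil: 'load' });
  const loadTime = Date.now() - start;

  await browser.close();
  return loadTime;
}

measureFreshLoad('https://example.com').then((ms) =>
  console.log(`Fresh load took ${ms} ms`)
);
```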
Is this an actual sales objection that comes up? Are potential customers saying that making the code open source would address their concerns, or would the cost to host and maintain the software internally be too high? Would code escrow be an option, so the source only becomes available if you go out of business?
No! But a lot of non-qualified-customers talk up the benefits of going fully open-source, so I take your point it may not be the best audience to be taking advice from? That's a nice idea about code escrow (although I'm slightly troubled by the incentives it creates ha ha!) -- had never heard of that. Do you have any links?
Cost to host should be OK, it's basically fire and forget, ignoring any customizations. Requires a bit of expertise if you want to modify things, so maybe that's an issue?
Someone was CFO at two companies and the auditors only checked the year end balance against his falsified statements. So he transferred money from the other company temporarily to make them match.
"""To avoid detection, Morgenthau doctored African Gold’s monthly bank statements
by, for example, deleting his unauthorized transactions and overstating the available account balance in any given month by as much as $1.19 million. [...]
Morgenthau knew that African Gold’s auditor would confirm directly with the bank the actual account balance as of December 31, 2021, as a part of its year-end audit. [...]
Morgenthau deposited more than half a million dollars of Strategic Metals’ funds into African Gold’s bank account on December 31, 2021, because he knew that African Gold’s auditor would confirm the account balance as of that date, in connection with African Gold’s year-end audit.
"""
Interesting. I guess that is the inherent flaw of all audit methods that predominantly check the paperwork while rarely venturing out into the real world. With sufficiently bad actors, all the paperwork can be doctored and completely untethered from reality. Such bad actors only need to build a plausible Potemkin village for the auditors in the selected spots where they are expected to verify that reality matches the presented paperwork.
Enron was doing a similar trick by selling buildings to another business entity and buying them back after the audit. I might not have all the details correct but it was the same type of shenanigans. :-)
I wondered about that when reading the Money Stuff article about it a while ago. What should they actually have done differently?
One of the issues was that "she could not share her customer list due to privacy concerns". So maybe JPM could have pushed back against that more?
"""Javice also cited privacy concerns in sharing Frank’s customer data directly with
JPMC. After numerous internal conversations, and in order to allay Javice’s concerns, JPMC
agreed to use a third-party data management vendor, Acxiom, to validate Frank’s customer
information rather than providing the personal identifying information directly to JPMC."""
I was involved in some diligence when a prior company was considering an acquisition. The numbers they claimed vs the numbers we could trust from their various SaaSes were pretty fishy. It was a small deal - more like $1M. We didn't pursue them, they don't exist any longer.
The gap here was _huge_. If I was the JPM diligence team, I might have asked them for read-only access to their product analytics. They claimed something like 10K FAFSA applications/day. This should show up nicely in their analytics tools. Yes, they could fake these visits--but it would be much harder to fake that you're getting 10K visits from appropriate regions, at appropriate times of day, with appropriate dwell times, with appropriate distribution of completion rates.
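To make the "hard to fake" point a bit more concrete, here's a toy sketch (not from the complaint; the `Event` shape, field names and thresholds are all invented for illustration) of the kind of checks a diligence team could run against a raw analytics export:

```typescript
// Toy sanity checks over a hypothetical raw analytics export.
interface Event {
  timestamp: string;   // ISO 8601
  region: string;      // e.g. "US-CA"
  completed: boolean;  // did the visit finish a FAFSA application?
}

function sanityCheck(events: Event[], claimedPerDay: number): void {
  // 1. Daily volume vs. the claimed figure.
  const byDay = new Map<string, number>();
  for (const e of events) {
    const day = e.timestamp.slice(0, 10);
    byDay.set(day, (byDay.get(day) ?? 0) + 1);
  }
  const days = [...byDay.values()];
  const avgPerDay = days.reduce((a, b) => a + b, 0) / days.length;
  console.log(`Claimed ${claimedPerDay}/day, observed ~${avgPerDay.toFixed(0)}/day`);

  // 2. Hour-of-day distribution: organic traffic has a strong diurnal shape,
  //    while generated traffic is often suspiciously flat.
  const byHour = new Array(24).fill(0);
  for (const e of events) byHour[new Date(e.timestamp).getUTCHours()]++;
  const mean = events.length / 24;
  const variance = byHour.reduce((s, n) => s + (n - mean) ** 2, 0) / 24;
  const cv = Math.sqrt(variance) / mean;
  console.log(`Hour-of-day coefficient of variation: ${cv.toFixed(2)} (near 0 looks synthetic)`);

  // 3. Completion rate should be plausible, not 0% or 100%.
  const completionRate = events.filter((e) => e.completed).length / events.length;
  console.log(`Completion rate: ${(completionRate * 100).toFixed(1)}%`);
}
```

Faking raw events that pass all of these at once (volume, diurnal shape, regional mix, completion rates) is a lot more work than faking a spreadsheet.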
In most jurisdictions it would generally be possible for the seller to hire outside counsel to validate customer metrics claims under attorney-client privilege without violating consumer privacy laws or customer agreements. The outside attorney could then provide a letter to the buyer attesting to what they found without revealing any specifics about individuals. Of course that would delay the deal, and the buyer here seems to have been irrationally eager to close the acquisition.
"""
After the August 3, 2021 Zoom meeting, the Data Science Professor returned a
signed version of Frank’s NDA. The Data Science Professor’s usual hourly rate was $300.
Javice unilaterally doubled the Data Science Professor’s rate to $600.
[...]
Specifically, on August 5, 2021 at 11:05 a.m., the Data Science Professor
provided Javice an invoice for $13,300, documenting 22.17 hours of work over just three days.
The invoice entries show that the bulk of his time was spent on the main task that Javice retained
the Data Science Professor to perform – making up customer data. The Data Science Professor’s
invoice indicated that he performed “college major generation” and “generation of all features
except for the financials” while creating “first names, last names, emails, phone numbers” and
“looking into whitepages.”
In response to the initial invoice, Javice demanded that he remove all the details
admitting to how they had created fake customers – and added a $4,700 bonus. In an email to
the Data Science Professor at 12:39 p.m. on August 5, 2021, Javice wrote: “send the invoice
back at $18k and just one line item for data analysis.” In total, Javice paid the Data Science
Professor over $800 per hour for his work creating the Fake Customer List, which is 270% of his
usual hourly rate.
The Data Science Professor provided Javice the revised invoice via email seven
minutes later at 12:46 p.m., commenting “Wow. Thank you. Here is the new invoice.”
"""
It sounds like his initial invoice was quite clear about the work completed, then updated at the client's request. So while you can argue moral grounds for not doing this work, I don't think there's illegality, i.e. conspiracy.
I mean, if you are a professor and knowledgeable about how the startup uses the data, it's hardly justifiable to say "oh crap, I didn't know they were using it for illegal purposes".
This is spoken to [in the full complaint][1]. The data scientist was told Frank really did have 4 million users, and the scientist only needed to generate this "synthetic data" as a way to "anonymize" their "real" data. I.e. the scientist was duped:
JAVICE told Scientist-1 [...] that she had a database of approximately 4 million
people and wanted to create a database of anonymized data that mirrored the
statistical properties of the original database (the “Synthetic Data Set”).
[After JAVICE sends Scientist-1 the data], Scientist-1 understood that the data
available via the Access Link Email -
**a data set of approximately 142,000 people** (emphasis added) -
was a random sample of a larger database which contained data for approximately
4 million people. In fact, that data represented every Frank user who had at
least started a FAFSA.
This happened in the late seventies. Nowadays you'd have a hard (though not impossible) time doing it because of federal laws against ageism. However, the tech industry seems to get away with it in hiring.