
Well, the argument that 1995 through 2015 was a cycle completely different from whatever the next one will be is not without merit.

I mean, it is easy to forget, but many, many things had to come together at one time in order for this to pop off like it did:

* high speed internet

* widespread consumer demand for high speed internet and services

* multi-core, low-power hardware that gave us smartphones and cheapish "device" PCs like TiVo/Roku/etc.

* widespread cellular and wifi networks

* miniaturization and improvement of many types of sensors

* widespread data collection of all types

* massive investments and growth in consumer GPU devices, which underwrote the ML boom

and I'm probably missing some things, but you get the idea.

All of these things had to come together at the same time to give us the boom that we just went through, and it gave rise to the likes of Google, Facebook, and so on.

This is very unlikely to repeat itself. Those who grew up in the late 90s, early 2000s may not really notice, but the difference between 1995 and 2010 is astronomical.

This is not to say that we're about to crash, or that there won't be another boom in short order, just that it will likely follow a very different pattern than the previous one. The period of 1995-2010/2015 was really a very unique confluence of events, historically speaking, and over what is really a very, very short time frame. Whereas the current boom is built around leveraging the smartphone and widespread internet access, the next will not be, since that space will already be filled out by competitors.




It is likely entirely without merit.

Many things had to come together from 1975 to 1995 to enable the things you're referencing. The list you made is not impressive versus the past; it's normal.

What happened from 1995 to 2015 that was more important than the Internet, transistor, microprocessor, router, DRAM or the GUI? Good luck resolving that debate.

Many things had to come together from 1955 to 1975 to....

I don't know how old you are and how familiar you might be with the prior half century plus in tech, but we could be here all day listing the incredible inventions and leaps forward in tech during each of those 20 year periods of time.

Nothing has changed fundamentally about what's occurring in tech. The process continues as before. Each new generation thinks what has happened during its era is particularly special or unique versus the past. We see the same generational bias in most everything, from music to politics.


Yes, I am old enough to remember, and I think you are reading more into my post than was there.

I am not claiming 1995-2015 was unique in the fundamental factors (we are all riding the exponential curve here), simply that the confluence of advances is unique to that time period, and gives you a unique distribution of companies/organizations/industries/etc. that is very different from other time periods.


That's right, and I'm not at all convinced that the author of the TechCrunch article is correct.

But there is one important difference between the past 20 years and the 20 years before that. The number of people participating in a self-employed or entrepreneurial role has been far greater in the past two decades than before.

We did see some of that during the PC revolution as well, but it was disproportionately smaller in scale.

I haven't done the research to say whether there were historical periods before in which such large swaths of the population were gripped by the idea that they could start their own business based on a new technology.

It's possible that it happened before, but I don't think it was like that between 1975 and 1995. Certainly not towards the end of that period because I would remember.


Of course the technology wasn't all invented from scratch in that period, but it was when a lot of things progressed just enough to generate massive markets which pulled a lot of money into the system.


The recent past does appear very special but I wouldn't discount the near future either. e.g. the tech for AlphaGo was made by a startup.

In any case I took the piece at its title's face value, where startups are nowhere near over. Unicorns shouldn't really factor into that.


> e.g. the tech for AlphaGo was made by a startup.

Making the next AlphaGo is far less accessible than making the next AirBnB. The gold rush where we're basically sticking a web/mobile app on a business and off to the races is ending.


Now we start to stick AI on a business. And as with Uber/Airbnb, where an app became the business, we'll see unicorns where the AI will become the business.


AI is some applied math. But as applied math goes, AI is only a tiny fraction, nearly absurdly narrow, and not very impressive. There's a lot more good applied math to be brought forward to exploit the recent fantastic hardware.


The other half of AI is data. Valuable proprietary data, e.g., credit card transactions or check-in data from Foursquare.


Can you give some examples of applied math sub fields that haven't been exploited to their fullest? I'm genuinely interested.


Statistical hypothesis tests: Commonly, calculations to predict something have two ways to be wrong: (A) predict it will happen when it doesn't, and (B) predict it won't happen when it does. Then, in the context of a statistical hypothesis test, you get to address the probabilities of A and B, adjust the test to get the combination of A and B you like best, or find a better test that will give better combinations. If you have enough data, then the classic Neyman-Pearson result says how to get the best test. The proof is like investing in real estate: first buy the property with the highest ROI, then the next highest, etc., until out of money. That's crude but not really wrong. I have a fancy proof based on the Hahn decomposition from the Radon-Nikodym theorem. Well, statistical hypothesis tests are being seriously neglected.
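To make the Neyman-Pearson idea concrete, here is a minimal sketch, with distributions and numbers made up purely for illustration: testing H0: N(0,1) against H1: N(1,1) on a single observation x. Since the likelihood ratio here is monotone in x, the best test at a chosen false-alarm rate is just a threshold on x.

```python
import math
import random

# Best (Neyman-Pearson) test of H0: N(0,1) vs H1: N(1,1) on one sample x.
# The likelihood ratio is monotone in x, so the best test at false-alarm
# rate alpha is simply "declare H1 when x > c" for the right threshold c.
def threshold_for_alpha(alpha):
    # Solve P(X > c | H0) = alpha for standard normal X, by bisection.
    lo, hi = -10.0, 10.0
    for _ in range(100):
        mid = (lo + hi) / 2
        tail = 0.5 * math.erfc(mid / math.sqrt(2))  # P(X > mid), X ~ N(0,1)
        if tail > alpha:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

alpha = 0.05
c = threshold_for_alpha(alpha)

# Check the achieved error rates by simulation.
random.seed(0)
n = 200_000
false_alarms = sum(random.gauss(0, 1) > c for _ in range(n)) / n   # type A error
misses       = sum(random.gauss(1, 1) <= c for _ in range(n)) / n  # type B error
```

The point of the "adjust the test" remark above is visible here: moving alpha moves c, trading false alarms against misses along the best possible curve.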

E.g., some tests are distribution-free. And for other tests, you will want to make good use of multidimensional data, e.g., not just, say, blood pressure or blood sugar level but both of those two jointly. Well, I'm the inventor of the first, and a large, collection of statistical hypothesis tests that are both distribution-free and multidimensional. That work is published, powerful, and valuable, but neglected. I did the work for better zero-day detection of anomalies in high end server farms and networks. So, I got a real statistical hypothesis test, e.g., I know the false alarm rate, get to adjust it, and get that rate exactly in practice. IMHO, my work totally knocked the socks off the work our group had been doing on that problem with expert systems using data on thresholds. Also, the core math is nothing like what is most popular in AI/ML now and, as far as I know, nothing like anything even in small niches of AI/ML now.

Once I was asked to predict revenue. We knew the present revenue, and from our planned capacity we knew our maximum, target revenue. So, roughly, we had to interpolate between those two. How might that go? Well, assume that the growth is mostly from current happy customers talking to people who are target customers but not customers yet. Let t denote time, in, say, days. Let y(t) be the revenue, in, say, dollars, at time t. Let b be the revenue at full capacity. Let the present be time t = 0 so that the present revenue is y(0). Then the rate of growth should be, first-cut, ballpark, proportional to both the number of customers talking, that is, y(t), and the number of target customers listening, that is, (b - y(t)). Of course the rate of growth is the calculus first derivative of y(t) or

d/dt y(t) = y'(t)

Then for some constant of proportionality k, we must have

y'(t) = k y(t) (b - y(t))

Yes, just from freshman calculus, there is a closed form solution: a logistic curve. So, the growth starts slowly, climbs quickly like an exponential, and then slows again as it approaches b asymptotically from below. So, you get a lazy S curve. So, it's a model of viral growth. You get the whole curve from minimal data: just y(0), b, and a guess for k. The curve looks a lot like the growth of several important products, e.g., TV sets. I derived this and used it to save FedEx. For all the interest in viral growth, there should be more interest in that little derivation.
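The derivation above can be checked in a few lines. Separating variables and integrating y'(t) = k y(t) (b - y(t)) gives the logistic curve below; the revenue numbers are hypothetical, just for illustration.

```python
import math

# Closed-form solution of the viral-growth model
#   y'(t) = k * y(t) * (b - y(t)),  y(0) = y0
# obtained by separation of variables: the logistic curve.
def logistic_revenue(t, y0, b, k):
    return b / (1.0 + (b - y0) / y0 * math.exp(-k * b * t))

# Hypothetical numbers: $1M revenue now, $100M at full capacity,
# and a guessed proportionality constant k.
y0, b, k = 1.0, 100.0, 0.001
curve = [logistic_revenue(t, y0, b, k) for t in range(0, 121, 10)]
```

The curve starts at y0, rises through the lazy S shape, and approaches b from below; a finite-difference check confirms it satisfies the differential equation.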

There is the huge field of optimization -- linear, integer linear, network integer linear (gorgeous stuff, especially with the Cunningham strongly feasible ideas), multi-objective linear, quadratic non-linear, non-linear via the Kuhn-Tucker necessary conditions, convex, dynamic, optimal control, etc. It is a well developed field with a lot known. I've made good attacks on at least three important problems in optimization -- via stochastic optimal control, network integer linear programming, and 0-1 integer linear programming via Lagrangian relaxation -- and attempted several more where I ran into too much politics. Sadly the great work in optimization is neglected in practice.
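As one tiny, self-contained taste of 0-1 integer linear programming (a toy, not any of the problems mentioned above), here is the classic knapsack problem -- maximize sum(v[i]*x[i]) subject to sum(w[i]*x[i]) <= W with each x[i] in {0,1} -- solved exactly by dynamic programming over the capacity:

```python
# A toy 0-1 integer linear program: the knapsack problem.
# maximize   sum(values[i] * x[i])
# subject to sum(weights[i] * x[i]) <= W,  x[i] in {0, 1}
def knapsack(values, weights, W):
    best = [0] * (W + 1)  # best[c] = max value achievable with capacity c
    for v, w in zip(values, weights):
        # iterate capacity downward so each item is used at most once
        for c in range(W, w - 1, -1):
            best[c] = max(best[c], best[c - w] + v)
    return best[W]

print(knapsack([60, 100, 120], [10, 20, 30], 50))  # → 220
```

General integer programs are NP-hard, which is exactly why techniques like the Lagrangian relaxation mentioned above matter for the big instances.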

The world is awash in stochastic processes, but they are neglected in practice. E.g., once for the US Navy, I dug into Blackman and Tukey, got smart on power spectral estimation, IIRC important for cases of filtering, explained to the Navy the facts of life, helped their project, and got a sole source development contract for my company.
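For a flavor of power spectral estimation, here is a raw periodogram of a noisy sinusoid -- the simplest spectral estimate (a proper Blackman and Tukey estimate would instead smooth via the windowed autocovariance); the signal and noise numbers are made up for illustration:

```python
import cmath
import math
import random

# Raw periodogram: power spectral estimate |DFT(x)[f]|^2 / n.
def periodogram(x):
    n = len(x)
    return [abs(sum(x[t] * cmath.exp(-2j * math.pi * f * t / n)
                    for t in range(n))) ** 2 / n
            for f in range(n // 2)]

random.seed(1)
n = 256
f0 = 20  # sinusoid frequency, in cycles per record
x = [math.sin(2 * math.pi * f0 * t / n) + 0.3 * random.gauss(0, 1)
     for t in range(n)]

spectrum = periodogram(x)
peak = max(range(len(spectrum)), key=spectrum.__getitem__)
# the spectral peak recovers the sinusoid's frequency bin, f0
```

Even with the added noise, the spectral peak stands far above the noise floor, which is the basic reason spectral methods work for detection and filtering.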

The crucial core of my startup is some applied math I derived based on some advanced pure/applied math prerequisites.

And there is a huge body of brilliant work with beautifully done theorems and proofs that can be used to get powerful, valuable new results for particular problems.

Computers are now really good at doing what we tell them to do. Well, IMHO, what we should tell them to do, beyond the obvious, comes nearly entirely from applied math.


Can you point me to some paper I might read? If it's not too much trouble.

I'm a CS grad student and sometimes it's hard to filter out the hype and find promising but underrated ideas among all the noise.


"Paper"? There are lots of pure/applied math journals packed with papers. I touched on the fields of statistics, probability, optimization, and stochastic processes, and each of these fields has their own journals.

Usually a better start than papers in journals is books. A first list of books would be those for a good undergrad pure math major. There you get to concentrate on analysis, algebra, and geometry, with some concentration on topology or foundations.

For grad school, you might want to do well with measure theory, functional analysis, probability based on measure theory, statistics based on that probability, optimization, stochastic processes, numerical analysis, pure/applied algebra (applied algebra -- coding theory), etc.

Then, sure, work with some promising applications and then dig deeper into relevant fields as needed by the applications.

One key to success is good "problem selection". So, with good problem selection, some good background, and maybe some original work, you might do really well on a good problem, publish some papers, do a good startup, make some big bucks, etc. That's what I'm working on: I picked my problem, did some original applied math derivations for the first good, indeed excellent, solution, and have my production code in alpha test -- 24,000 programming language statements in 100,000 lines of typing.

It's applied math; hopefully it's valuable; but I wouldn't call it either AI or ML.

In case my view is not obvious, it is that the best help for the future of computing is pure/applied math and not much like current computer science. Computer science could help -- just learn and do more pure/applied math.


Time to sell shovels


After the first year in whatever cycle it is always about shovels.


It's why everyone wants to sell "a platform" today...


> I mean, it is easy to forget, but many, many things had to come together at one time in order for this to pop off like it did:

Yup. You describe a gold mine. Well, there's still a lot of gold in there. The amazing hardware developments you describe are not yet fully exploited.



