I'd be more than happy to work with you on this. I run Product & Software here at SP//, so I know the right people ;) Feel free to email me: ben.gabler (at) stackpath (dot) com
Really appreciate your feedback. I'd definitely like to hear more about the latency spikes, if you don't mind. Also happy to talk further about the per-request and bandwidth pricing logic.
Can you shoot me an email? ben.gabler (at) stackpath (dot) com
This is super helpful, really appreciate it. I will say that this feature, along with reporting delivery by file, is among the top feature requests for the new SP// platform.
We can provide you with access logs today, but that's not as convenient as the API above.
We also have several users building real-time CDN logs with our Serverless EdgeEngine, so that's an option as well.
Anyway, I've noted your request for a raw logs API on the new platform and will be sure to stay on top of it.
Feel free to email me anytime, always happy to help - ben.gabler (at) stackpath (dot) com
Edge can be defined in many ways, and the industry is still busy defining it.
At StackPath, our definition of Edge is simple: being as close as possible to the eyeballs, aka the users. Think of it as the front door to the internet.
Today our Edge spans major IXs (internet exchanges) around the world, and that's just the beginning. 5G is approaching quickly, along with container data centers.
We built our orchestration system so it can deploy and manage workloads anywhere. In the future, that will include 5G container data centers, which get workloads even closer to things like self-driving cars, smart cities, IoT devices, <insert your idea here>.
"workloads even closer to things like self-driving cars, smart cities, IoT devices"
Oh, please. If you really need those last few milliseconds of lag reduced, you need local computation. If you don't, an AWS datacenter on the same continent is probably good enough.
Ben from the StackPath product team here. We agree that containers or VMs may not always be the best option for a workload. That's why we also offer a product equivalent to Cloudflare's (great product, btw): https://www.stackpath.com/services/edgeengine/
However, to be clear, our container and VM solution is not a "function"-type offering. You can deploy a container and/or VM workload on our Edge, similar to what you might find at cloud providers. The main difference is that we sit a layer above the cloud providers and make deploying worldwide simple, secure, and fast.
With a few clicks or a single API call, you can deploy a microservice all over the world (and even add an anycast IP if you need one) in under 60 seconds.
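As a rough sketch of what that single API call looks like (the field names and PoP codes below are purely illustrative, not our actual API schema), you'd POST one spec and we fan it out worldwide:

```python
import json

# Hypothetical workload spec for illustration only;
# these field names are NOT StackPath's real API schema.
workload = {
    "name": "hello-edge",
    "image": "nginx:alpine",                  # any container image
    "targets": ["ams", "jfk", "nrt", "syd"],  # example PoP codes
    "anycast_ip": True,                       # optional anycast IP
}

# One API call: this JSON body would be POSTed to a (hypothetical)
# workloads endpoint, and the orchestrator deploys it everywhere listed.
body = json.dumps(workload)
print(body)
```

The point is that the spec itself is the whole deployment story: one request, every listed location.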
We are not. Knative is a great project and something we're definitely looking into for more "function"-type workload offerings down the road.
Today our container/VM solution has no warmup concept beyond the initial deployment of your container. You simply specify your image and some attributes, and it's deployed to the locations you choose on our Edge. Once deployed, you can delete the workload at any time, but it is not elastic: it does not scale up or down based on requests to the workload.
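In other words, the lifecycle today is just create and delete, with nothing request-driven in between. A minimal sketch of that lifecycle (the client class and method names here are made up for illustration, not our real SDK):

```python
# Hypothetical client illustrating the create/delete lifecycle
# described above; names are invented, not StackPath's real SDK.
class EdgeWorkloads:
    def __init__(self):
        self._workloads = {}

    def create(self, name, image, locations):
        # Deployed once to the specified locations; there is no
        # request-driven scaling or warmup beyond this initial deploy.
        self._workloads[name] = {"image": image, "locations": locations}
        return self._workloads[name]

    def delete(self, name):
        # Deletion is explicit: the workload runs until you remove it.
        self._workloads.pop(name, None)

client = EdgeWorkloads()
client.create("api", "myorg/api:1.0", ["fra", "sea"])
client.delete("api")
```

Contrast this with a function platform, where instances appear and disappear per request; here the workload is a long-lived deployment you manage yourself.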