I'm... not convinced this is a decent offering. At the same time, I've repeatedly been wrong about this whole cloud computing thing, so maybe I'm just that person stuck in the old ways, telling everyone else they're doing it wrong.
But it seems like a recipe that combines so many of the worst parts of running a groundstation. The setup can't easily be specialized enough to really milk capability (say, different ways to correct pointing and orbit determination errors that need more direct control of the antenna than just feeding it a TLE), yet it's general enough that there's still significant engineering effort for anybody who wants to use it. And if you want to do anything requiring round-trips to the satellite (shipping raw radio signals to a DC, processing them in software to demodulate and decode, then encoding and modulating your response before shipping it back as raw radio signals to the groundstation), sorry, but that latency is painful and hurts you on link utilization.
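To put rough numbers on that round-trip pain, here's a back-of-envelope sketch; all figures are my own illustrative assumptions, not anything AWS has published:

```python
# Back-of-envelope: cost of backhauling raw IQ samples to a data center.
# Sample rates, network latencies, and processing times are assumptions.

def iq_backhaul_gbps(sample_rate_msps: float, bits_per_sample: int = 16) -> float:
    """Bandwidth needed to ship complex (I+Q) samples, in Gbit/s."""
    return sample_rate_msps * 1e6 * bits_per_sample * 2 / 1e9

def round_trip_ms(one_way_network_ms: float, processing_ms: float) -> float:
    """Ground station -> DC -> ground station, plus demod/decode/re-encode time."""
    return 2 * one_way_network_ms + processing_ms

if __name__ == "__main__":
    # A modest 50 Msps downlink at 16-bit I/Q already needs 1.6 Gbit/s of backhaul.
    print(f"{iq_backhaul_gbps(50):.1f} Gbit/s")
    # Assuming 20 ms each way to the DC plus 30 ms of software processing,
    # every protocol turnaround burns 70 ms of a pass that may only last ~600 s.
    print(f"{round_trip_ms(20, 30):.0f} ms")
```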
In comparison, buying a small turn-key dish and operating it is not that bad. Paying per-minute or per-pass gets really expensive for smallsat/cubesat operators, at least at the usual prices I've seen and compared to the operating cost of equipment you own. Also, if you're just running a tech demo then it's not exactly prohibitive to partner with somebody who has excess capacity.
I think this offering is missing the mark. If I had to make an analogy, I'd say they're offering roughly "GCP for downlink" to a world that's not quite homogeneous enough for it to make sense. If I were trying to do this, I'd be aiming to be the "Squarespace of downlink" (more tightly scoped capabilities, but much better performance and more turn-key) or someone delivering groundstation-in-a-box kits.
Context: I used to work at Planet Labs, where I spent considerable time collaborating with the groundstations team, plus some collaboration with the mission ops team.
I don't know anything about this field, so I'm hoping I could bug you with some questions because it seems really fascinating!
First of all, could you give some concrete examples of the kinds of data people using this would be handling? The examples provided in the article are incredibly vague, and the linked Wikipedia articles didn't help clarify.
Could you give some ballpark numbers and breakdown of the costs involved? How much does it cost to manufacture and launch a small satellite? What about buying a dish and operating your own ground station? What about regulatory fees for licensing and registration? There's a few mentions of hobbyists, but I have a hard time imagining many people with enough money and interest to deal with all these challenges. Is there some secret cabal of millionaires and billionaires that are really into space?
> First of all, could you give some concrete examples of the kinds of data people using this would be handling?
Planet operates imagery satellites, and as far as I know that's the bulk of commercial data _produced_ in orbit that needs to be downlinked. Other commercial data could include radar (ICEYE, Capella), ship/plane monitoring (Spire, Iridium NEXT), or high-latency low-bandwidth data (think devices out in the field, on vehicles, with relatively light telemetry and/or control needs). Telecom is usually "bent pipe" (think dumb repeaters) and not any kind of store-and-forward setup that would use this.
In the future, I could see some broadening of the field with more tech demo/development that needs dedicated hardware, and the possibility of commercial experiments run in an automated orbiting lab downlinking data.
Of course, all of those also need what's called TT&C (Telemetry, Tracking, and Control) links, which I've nearly always seen on a separate physical radio. That tends to be a less demanding link, and more likely to be able to make use of something like this IMO.
> Could you give some ballpark numbers and breakdown of the costs involved?
Unfortunately, I very much cannot do more than point at what's already public; that's one thing Planet is very tight-lipped about. The idea that one spends about as much on launch as on the hardware they launch seems about right; you can go see what Rocket Lab[1] and SpaceX[2] quote as prices. I'm sure you can imagine there's still a lot of negotiation once you're talking about big sticker prices.
Someone's already mentioned SatNOGS[3] further down in the comments. They seem neat, although I haven't looked too much into them yet. Schools are spinning up a surprising number of satellite projects, and amateurs are already developing hardware, running tests, and practicing by radioing the hardware that's already up there. I'm sure it's not far off that some will pitch in together on a group buy of a very small amount of launch space.
> Is there some secret cabal of millionaires and billionaires that are really into space?
Yes, but they're not so secret. Elon Musk, Jeff Bezos, Richard Branson, off the top of my head. DFJ and Data Collective are VCs that have been fairly interested, I'm sure there are more looking to get a piece now.
Whoa! Fancy seeing you here, Pat. Always cool to see a fellow Planeteer in the wild :)
Also, ditto what he said. This news was certainly an interesting conversation at Planet HQ today, though a couple of folks knew the people doing this work at Amazon before it was announced. There are a lot of regulatory, licensing, and construction hurdles. Not to mention, the placement of the ground stations is going to be crucial, but otherwise... I can see upstarts in the smallsat industry totally taking advantage of this, should their orbits complement the deployment sites of the ground stations.
We build and operate our own ground stations for the most part and I can understand/sympathize that we're fortunate enough to do so since much of our business depends on it. It's a costly exercise for sure...
Is buying a 'turn-key' dish going to be easier than using their service? I assume the 'can't easily be specialized enough to really milk capability' part also applies to the off-the-shelf system (disclaimer: I know zilch about this).
They say it's going to save at least 80% over the cost of operating your own, can only assume they will price it very competitively and have done the math. Hard to say anything without seeing numbers.
You're absolutely right that it depends on pricing, assuming what they offer has the technical capabilities to do what a customer needs. But how they calculate that 80% matters too, and if it's measured against a more "traditional" groundstation contract I'd say that's also being misleading.
I've seen new turn-key installations come with an equivalent UI and an API to match it. Plus a bunch of debugging/calibration/testing controls. And then (assuming licensing, real estate, and services availability) you get to drop that "anywhere" and optimize its geographical location to match up with your system requirements.
Turn-key systems will totally let you specialize to maximize utility: selection of RF feeds/hardware on the input side, and e.g. hardware data modems and co-located command-and-control servers on the output side.
You're probably right. I'd guess this will evolve and improve over time as the AWS team better understands market needs. It's also quite likely that satellite designs will evolve to be more compatible with AWS Ground Station. In fact, once the service has been around for a few years, I wouldn't be surprised if we see purpose-built sats that aim to use this exclusively. The 12 planned stations could also provide much better coverage, i.e. save the sat owner having to build stations around the globe.
At a glance, that's offering funding for commoditized satellite busses (the infrastructure of a physical satellite) and payloads, so none of that goes to the groundlink. I'd also suggest that this funding offer is trailing the direction that the community is headed, as opposed to leading it. There are all kinds of players trying to do this.
I definitely agree with this. It looks very good on paper, but the implementation is often just as difficult as developing your own groundstation. I used to work in mission operations at a small commercial satellite company, and when we looked at using a service such as the one Amazon is expecting to provide, it still required installing additional satcom modems and equipment at the groundstation.
That said, there was significant difficulty in getting a dedicated groundstation due to real estate and licensing issues. So I think that it is still an attractive option for those who may not be able to furnish their own ground station depending on the constraints of the mission.
Also ex-Planet and ex-Spire. What if I told you a good rule of thumb was 1/3 for launch, 1/3 for the satellite, and 1/3 for everything else? This is a huge game changer if anyone uses this stripped-down early product, which I agree is very limited. You want a lot more thought on tracking and the different options for ground-to-space comms.
Oh, hey Newman, we chatted a couple times while you were at Planet. Sure, the needs between early testing and production are different, but I still argue that this does neither. When we were doing early work on the satellites and ground segments, being able to get right into the belly of every part of the system was super useful, as we dealt with different things acting up, weird workarounds, unsure technology directions... This is a capability you get by being involved at all levels of a transparent tech stack and having some good generalists on hand to get it all going.
Of course, I'm biased; I say this as a generalist who has done work inside the equipment room at the base of dishes, hooking up my own hardware and managing live passes, as well as debugging via interactive sessions with satellites.
I guess it depends a lot on how much control they allow the offeree over the antenna, ground station configuration, and signal processing. Anyone that cares about the data quality they are getting from their satellites is not going to trust a 3rd party solution.
People that record music professionally don't buy an Android phone and settle for that, given the severe restrictions it imposes on hardware optimisation.
Likewise, people that are concerned about optimising the radio link are not likely to accept restrictions in the receiver design for instance.
What kind of restrictions? Isn't a VITA data stream broadband SDR data? The entire radio implementation would seemingly be under the control of the client.
I would assume that one of the value propositions of having many ground stations to choose from is that one could simply rent one that was closest to the next flyover.
So I'm curious what factors you are thinking of that would not make sense as part of the service offering but would be needed by would-be customers of the service?
Imagine Amazon building and launching satellites like space based data centers. AWS in orbit. You could provision VMs on a satellite equipped with an array of general sensors, cameras, etc.
Considering the difficulty of dumping heat in a vacuum and the cost of orbiting huge heat exchangers, I wager that heavy computing shall remain a mostly planetary activity for quite a while longer.
Heat is a problem, but I think it is safe to say that power is a bigger one. Those big arrays are really expensive, really heavy, degrade in output over time, and have been known to fail on occasion. [0][1]
Sadly no. Space is cold but also very low on convenient mass to transfer the heat into. Vacuum is an ideal insulator. On Earth, you can put the heat into water, air, thermal pastes, etc., but in space, you have only slow old black body radiation, unless you do something more creative.
Space in itself has no temperature. It does have some background radiation, and thus you can measure its temperature (pretty cold at something slightly over absolute zero).
Here's the thing though. There's very little matter even in low earth orbit, so you can't use matter to transfer heat as you would do on Earth (conduction and convection are out). That leaves only radiation, which requires a pretty large surface area. Try placing a computer in vacuum even on Earth and report back what the temperature looks like.
So no. Getting rid of heat is a big issue over there. Even in the shade.
Not as well as intuition suggests. Deep space is cold, but also the sun (and reflected energy from the Earth!) is pretty hot. Without air, there's no convection so it's all radiated energy transfer, so you're limited by how much surface area you can "point at" deep space while not also pointing it at the sun.
Space near Earth is hot (apparently, ~ 120°C) in the sun, and cold in the shade (e.g., the Earth's shadow). On average, it's not particularly cold. You can get colder average temperatures on Earth, without the wide swings.
But more importantly, space is empty, and the best way to cool is to dump heat into some medium that carries it away (air is the basic one on Earth, but applications that really need cooling like to use water as the primary medium). In space you've got... nothing, basically, so you're stuck with radiating, which works, but poorly.
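For a feel of just how poorly, here's a Stefan-Boltzmann back-of-envelope; the emissivity and temperature are assumed values, and solar/albedo loading and view factors are ignored:

```python
# How much radiator area does it take to reject waste heat in vacuum?
# Pure Stefan-Boltzmann sketch; real spacecraft thermal design is messier.

SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def radiator_area_m2(power_w: float, temp_k: float, emissivity: float = 0.9) -> float:
    """Area needed to radiate power_w at temp_k into deep space (T_env ~ 0 K)."""
    return power_w / (emissivity * SIGMA * temp_k**4)

if __name__ == "__main__":
    # Rejecting 1 kW at a comfortable 300 K electronics temperature:
    print(f"{radiator_area_m2(1000, 300):.1f} m^2")  # roughly 2.4 m^2
```

So even a modest server's worth of waste heat needs a couple of square meters of radiator pointed at cold sky, which is part of why orbital data centers are a hard sell.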
It's not that heat can't go anywhere, it's that there's nothing to carry heat away.
On Earth, air molecules can carry heat away from you. You transfer heat to the air around you. In space, there is almost no matter around you to absorb the heat and carry it away from you.
Your only option is to radiate heat away from you in the form of infrared light but that is a slow process.
Vacuums (or near vacuums) are actually good insulators. Double-walled insulated travel mugs are a great example of this.
Also, if you think about this in the context of the Sun, you get a really visceral feel for just how much energy is being produced. Black-body alone is transferring that energy to us. Scary.
The SciFi novel "Sundiver" by David Brin is based around using a laser to dissipate heat from a manned spacecraft. Sadly thermodynamics mean it wouldn't work in reality.
Genuinely curious - what would be a good use for a generic cubesat with average-ish components, like cameras, sensors, radios, etc. - nothing custom made and overly specialized? Especially considering the cost.
I would guess you'd have a relay for any data you want transferred from non-connected devices/locations, e.g. farm equipment, wind turbines, ships, etc. Also, a client-to-space-to-EC2-only VPN in a I-am-my-own-satellite ISP may have benefits. Unsure whether by "average-ish" you meant the ability to broadcast/receive ground data.
Is it really cost effective to roll your own for that kind of thing? I would have thought there were enough off the shelf satellite/m2m solutions for remote connectivity, eg https://www.orbcomm.com/en/networks/satellite
> Is it really cost effective to roll your own for that kind of thing?
Nope, hence leveraging Amazon et al. I would consider the AWS "above the cloud" offering (as I've now dubbed it) to essentially be the same commoditized thing as what you linked. An extraterrestrial CDN with edge compute if you will.
A ground station is going to involve some RF equipment first: an LNA to boost the received signal, which then feeds into a downconverter to bring it to 1200MHz typically. The 1200MHz signal goes into a modem, which can either be something very specific (think satellite TV set top box) or something very capable (think big server with a custom ADC capture card in the back). The modem will do some forward error correction. FEC can involve a mix of Reed-Solomon, Viterbi, and LDPC. The bits can now be sent out over a network connection. Uplink involves basically the reverse process, except there is usually no FEC, since you can always boost your ground antenna to be much more powerful than the one in space. An exception to that would be uplink signals meant to be relayed to the ground. More of those "bent pipe" satellites are becoming digital, which allows them to process signals and be smarter about channel utilization.
All of this equipment can usually fit within a rack or two. The downlink stuff can fit within a few U. Uplink involves some big power amplifiers. Back in the 60s it would have involved dozens of racks of equipment.
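As a toy illustration of the software half of that modem chain, here's a hard-decision BPSK demodulator sketch. A real modem also does carrier and timing recovery plus the FEC mentioned above (Reed-Solomon, Viterbi, LDPC), none of which is shown here:

```python
import numpy as np

# Minimal sketch of the "modem" demodulation step: recovering bits from
# noisy BPSK baseband samples. Parameters are illustrative, not from any
# particular satellite link.

rng = np.random.default_rng(42)

def bpsk_modulate(bits: np.ndarray) -> np.ndarray:
    """Map bits {0, 1} to symbols {+1.0, -1.0}."""
    return 1.0 - 2.0 * bits

def bpsk_demodulate(samples: np.ndarray) -> np.ndarray:
    """Hard-decision slicer: negative sample -> 1, positive -> 0."""
    return (samples < 0).astype(int)

if __name__ == "__main__":
    bits = rng.integers(0, 2, size=1000)
    # Pass the symbols through an additive white Gaussian noise channel.
    noisy = bpsk_modulate(bits) + rng.normal(0, 0.3, size=bits.size)
    recovered = bpsk_demodulate(noisy)
    print("bit errors:", int(np.sum(bits != recovered)))
```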
Eh. Space is just a place that's annoying to ship to. (With the one key advantage of having line-of-sight to arbitrary deployed IoT devices.)
"Underwater", meanwhile, is more of a mode of operation / design for data centers, with the goal of that simplifying (and decreasing costs for) cooling.
An underwater data center doesn't imply that the data center is actually off-shore to any large degree. I'd imagine they'd look more like undersea-cable landing points, built "into" a beach.
I tried looking for environmental impact studies for Microsoft's data center project, but all I could find were object tracking models that their team built for tracking marine life.
If a published study on this exists, I'd love to take a look.
Not likely; water is a good heat conductor, so there will just be an area of slightly warmer water near the outlet. You might get a micro-ecology with some tropical fish far from their normal grounds, but nothing dramatic will happen.
It'll certainly not be worse for the environment than any normal data centre, unless it springs a leak and something toxic escapes.
Maybe, but it could also reduce fossil fuels used to operate less efficient cooling systems in traditional data centers. I’d love to see a breakdown on the impact of each approach.
Underwater data centers make a great deal of sense - close to customers (low latency), fast deployments (decision to power on in 90 days), and, of course, cooling efficiencies. I expect them to actually happen if Phase 2 goes well.
Then again, I'm a bit biased - I worked on both Phase 1 and 2.
Did you know that Eric Schmidt funds a deep-sea drone submersible project out of Alameda to map and sensor the oceans? It's funded under his and his wife's foundation, but basically it's a secret project to map the oceans. They want to be first to own all that data.
It would finally put the cloud in the sky! Marketing will be ecstatic over all the sky based puns they'll be allowed.
Also, I guess it would solve the problem of it being expensive to put things in space by virtue of somehow having made someone else pay for vast overcapacity of sensors.
Sounds like this was built for DigitalGlobe (announced last year they are going all in on AWS and moved ~100PB library into AWS). Satellite directly to S3 sounds perfect for them.
Seems somewhat similar to what Descartes Labs is doing - a "data refinery that combines data from diverse sources, cleans it up and makes it ready for modeling - and a platform to upon which to build living, learning models" [1]
I'm the CEO of Descartes Labs. It's indeed true that we've built a data refinery that ingests lots and lots of satellite data. The data refinery can be seen as a two-sided marketplace. On one side, we form partnerships with satellite and other geospatial data companies (in addition to open source data from NASA, ESA, and others) and pull in all of that data. On the other side, scientists can run computations over huge amounts of data from multiple datasets. For now, most of our business has been done on the scientist side. In principle, we could provide our infrastructure to satellite companies so they don't have to build out the software on their own. Most hardware companies suck at being software companies.
Amazon's offering is geared more towards ground stations, but they might move up the stack and start providing data refinery-type services on top of the ground station work.
Oh, our entire stack is built on Google Cloud Platform.
Can you give an example of "refined" data versus what one might get through the AWS product? ... in order to demonstrate the sort of tech skills and effort that differentiate the two. I'm basically hoping for some symbol grounding for "data refinery".
Also, what's the TAM in your specific market vs the AWS product's market? How has the TAM changed in the past 5 years?
Lastly, I heard your head of engineering brews better beer than any of your competitors. Is that true? Can he provide samples?
Refined data probably means getting insights from the vast amount of aerial imagery (and related like SAR or elevation model) data that has become available in recent years. Having access to the data is one thing (e.g. Landsat and Sentinel provided by Google, Amazon and other parties), but processing it efficiently is still non-trivial.
Examples include Land-Cover-Mapping (mapping pixels to classes like forests, urban areas, water, etc.) which can then further be used to do crop monitoring or land-use monitoring.
I guess this is different than the product AWS is offering here, which is more about getting the data from/to the satellite, but not about processing (at least for now).
Yeah these are classic examples of 'providing business value'. Want to be an agricultural tech company? Ingest some satellite data and calculate NDVI, boom you now know machine learning, data science, and have created a great business product that helps save the world by making farmers better.
Satellite data imo sucks, especially aerial imagery. Too many damn clouds to get anything useful in real time haha!
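For the curious, the NDVI calculation mentioned above really is this simple; the band arrays below are synthetic reflectance values, not real imagery:

```python
import numpy as np

# NDVI (Normalized Difference Vegetation Index) from red and near-infrared
# reflectance bands -- the classic "satellite data to business value" step.
# Healthy vegetation reflects strongly in NIR and absorbs red light.

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """NDVI = (NIR - Red) / (NIR + Red); near 0 for water/soil, toward 1 for dense vegetation."""
    denom = nir + red
    # Guard against divide-by-zero on dark pixels.
    return np.where(denom > 0, (nir - red) / denom, 0.0)

if __name__ == "__main__":
    nir = np.array([0.5, 0.4, 0.1])  # healthy crop, sparse crop, water
    red = np.array([0.1, 0.2, 0.1])
    print(ndvi(nir, red))  # roughly [0.67, 0.33, 0.0]
```

The hard part isn't this formula; it's the cloud masking, atmospheric correction, and pixel alignment that has to happen before it.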
Interesting. At first I thought you meant Bentley Descartes, the CAD/BIM/GIS/photogrammetry reality modelling software. It's entirely different, though, and interesting that the two are sort of related and have the same name.
Labs looks very interesting, very lofty yet achievable goals.
Identify high yield market, commoditize, everyone benefits, except the old school and their previously captive audience and mandatory long-term commitments. Sound familiar?
I wonder if the same principle could work for telescopes, or are radio telescope designs far too specialized to be a commodity? Huge installations funded in spite of political boundaries might be possible in such a scenario.
I would predict the opposite. If you have 4,000 satellites you can justify ground stations but if you have one cubesat you might prefer to rent by the minute.
Is starlink going to work for third party satellite communication? Somehow I assumed that the antennas would be directional towards earth. Also the proposed ground stations are said to be pizza box sized, a bit large for a cubesat if they can't be miniaturized.
I think there's going to be an API for everything renaissance, and this is just another example. Being able to automate all the things and create new businesses, processes, and workflows makes it a great time to be a coder.
I can see how you might have parsed it that way, but I think he meant "(API for everything) renaissance" rather than "API for (everything renaissance)" (which makes a lot more sense if it was an API that related to 16th century paintings or something).
I feel this is an early part of Jeff Bezos's plan to build out the infrastructure needed for space exploration, as he once did to power the Web with AWS.
ViaSat is also trying to do something similar with a "pay by the minutes" usage for their groundstations and communication satellites. Although the main goal for the ViaSat project is continuous coverage so that you can get data from satellites in near real-time.
I guess the idea is that if the connection weren't encrypted you could listen to anything out there (like satellites used for transmitting weather information to ships at sea).
The ITAR laws can also cover data. A lot of the data coming from ITAR-restricted technology is also ITAR-controlled because, in theory, you can infer something about how it works based on the data coming from it. This is typically true of data coming from spacecraft payloads or new technology.
Encryption is still not necessarily required, unless you are landing the signal only on a groundstation in the United States.
AWS can handle ITAR-controlled data, though. They already have AWS GovCloud for data subject to ITAR restrictions.
Yes, ITAR also relates to plans, schematics, software, data, all sorts of things. If anything, the ITAR language itself is very vague.
But usually the science type or payload data is one thing, and then the lower level hardware telemetry is done in a different way.
I've used GovCloud to store ITAR data. It's cool. If you encrypt your ITAR data, you can also store it in a public cloud like S3, but just for storage, you shouldn't decrypt it there or have the keys there.
Pretty sure NOAA also regulates the use and encryption of the data collected. Also the keys are typically in an HSM with lots of security protocol and air gap, so good luck with that.
NOAA has (though it may have changed recently) had regulatory control over some types of data collection, such as any imaging data captured, and indeed had requirements on encryption, both on the link layer and even on disk and data encryption for groundstation sites not owned by the operator. It's not a standardized specification for how things are encrypted, and it doesn't apply to all data, but there is a process.
NASA telescopes used to be governed by ITAR. They've relaxed those a lot, but you had to use software to transmit the data encrypted all the way to your science pipelines (e.g. Telescope to White Sands to Goddard to $INSTITUTION)
> NASA’s telescopes are basically spy satellites pointed outwards.
Literally in at least one case. The NRO donated two surveillance satellites to NASA back in 2012 [1], and the plan is to fly one of them as the WFIRST telescope [2]. I don't think there is a launch date yet.
Of intelligence data there is. Read up on ITAR and EAR. So you don’t think that the satellites AWS will interface with will be facing earth with sensors?
ITAR is pretty limited in what it considers intelligence/military. Unless you're looking for missile launches or doing remote sensing with technologies that are better than what's available to the general public, it probably won't apply. There's plenty of interesting things you can do without running afoul of ITAR/USML definitions.
Some telemetry and sensor data may be EAR99, which, as far as export restrictions go, is about as nonrestrictive as it gets. For example, iTunes is EAR99; that's why the EULA says you're not allowed to use it to develop nuclear or biological weapons.
I could imagine companies like Planet using this for more points of presence. You only get tiny windows of time to download bits from satellites to ground stations and you have limited storage capacity. Additional ground sites could really help companies get more data out of fewer craft.
(Of course using a remote ground station operated by someone else completely defeats the purpose of Blockstream Satellite, but it would be a fun "hello world" for AWS Ground Station, if it were self-serve)
Another amateur radio sidenote for the day[0], SATNOGS[1] performs a similar task by tracking and recording satellite passes of LEO cubesats with amateur radio telemetry and communication payloads via a network of DIY ground stations. Anyone can see a history of any pass of a tracked satellite over a particular station including a RF spectrum waterfall, audio capture, and telemetry output if supported.
The SATNOGS receivers, usually a low-cost RTL-SDR, cover a much wider spectrum than just the amateur bands (which prohibit commercial use of their spectrum), so I wouldn't be surprised if non-amateur entities are eyeing the service, or if the SATNOGS organization is thinking about going commercial (although there's already competition in this field besides AWS Ground Station).
Looks very interesting. Looks like they use the NORAD ID to get frequency information for the various channels? I wonder what frequencies/bands they will support (since this gets down to the RF hardware on the groundstations)?
Since they're asking for FCC license info and there isn't a canonical Norad ID to frequency source, I assume the Norad info is just for tracking, Doppler, and scheduling of passes.
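For a sense of why the tracking and Doppler side matters, here's a nonrelativistic Doppler sketch; the 7 km/s range rate and 2.2 GHz S-band center frequency are illustrative assumptions:

```python
# LEO satellites move fast enough relative to a ground station that the
# received downlink frequency shifts noticeably over a pass, which is why
# pass prediction from orbital elements feeds the receiver tuning.

C = 299_792_458.0  # speed of light, m/s

def doppler_shift_hz(f_center_hz: float, range_rate_mps: float) -> float:
    """Received-frequency offset; negative range rate (approaching) shifts up."""
    return -f_center_hz * range_rate_mps / C

if __name__ == "__main__":
    # A satellite approaching at 7 km/s on a 2.2 GHz S-band downlink:
    print(f"{doppler_shift_hz(2.2e9, -7000):.0f} Hz")  # roughly +51 kHz
```

The shift sweeps from positive to negative as the satellite passes overhead, so the ground station has to track it continuously, which is presumably one thing the NORAD ID registration enables.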
Yeah I just didn't see any box to put in a frequency on that page though, although you can pick any NORAD ID. Maybe this is part of the "contact us" procedure. I'm also guessing that the USSPACECOM will tell Amazon to not let you try to follow anything classified.
This is missing so much key information into its usefulness that I really question who this article is geared towards. There's no mention of supported frequency ranges, receiver sensitivity, antenna sizes, polarization, location of the stations. They talk only about downlink, so I assume that this is only for downlink and that there's no uplink possibility, but if there is there's a whole new range of technical specifications that would need to be discussed.
I usually say it's 12-18 months between when an AWS service is announced and when it's actually useful and stable enough to use in production. In this case, I'd easily double that.
What I would like to see is “satellite as a service” for satellites to have rentable space for uploading code, and accessing their capabilities in a colocated way or full instance only (without breaking them... you’d need to pay to cover the insurance too).
It’s far cheaper to send bits to space than actual matter on a rocket.
It would also promote re-use of satellites and mitigate Kessler syndrome (unless induced demand makes it worse).
There are a number of satellites with publicly available downlinks (think NOAA weather sats). I'm curious about why I'd need to register that NORAD ID with my AWS account to be able to receive that data - is it just so that the orbital passes will show up appropriately when I want to reserve a contact or something?
(Also, the "imaginary" sat that was added to the account in this case was in fact NOAA 15. That one actually transmits realtime imagery to the ground that anyone can receive).
Anyone know how big or sensitive their dishes are? I wonder if they could be used for some fun astronomy stuff when not being used to talk to sats. Also curious about the transceivers and so on. Not sure if they'd disclose any of this due to security concerns.
Also, I'm guessing you need to provide your NORAD ID and FCC license so these dishes don't get used for sigint.
It's about not having to build your own Ground Station. To me "Downlink" implies that it's just the link between two pre-existing things, the satellite and something on the ground.
But there are no clouds in space! Unless your particular spacecraft is flying through the Oort cloud I suppose, but surely there's not so many of those satellites out there that would make it worthwhile for AWS to start an entire service for that?
This is cool, but now I'm more concerned with learning how launching micro satellites has become so easy for the masses, which would be the opposite of my normal reaction. Meaning this is not without bad side effects. I watched an "In a Nutshell" video about the end of space: orbiting debris is becoming an exponential threat over time, as small debris moving at 20,000 mph vaporizes things it comes in contact with, creating more debris and more collisions, eventually forming a death shield against any incoming/outgoing space transport and effectively halting space travel for centuries. Here is that video FYI: https://www.youtube.com/watch?v=yS1ibDImAYU I don't see how this is helping, but possibly there will be cleanup solutions.
These micro-satellites are not high enough up for them or any debris to remain in orbit for a significant amount of time without station-keeping. Cube-sats will usually only last a few years before de-orbiting due to the very slight atmospheric drag at the height they operate. This is a deliberate decision to avoid the issue you mention (and the various space agencies do co-ordinate on this: see https://www.iadc-online.org/). They also have rules which apply to larger satellites, which mean that if a satellite does not have a naturally decaying orbit, there needs to be a plan to de-orbit it or put it into a 'safe' orbit at end-of-life.
I think he is concerned that now that there is an easy way to set up an antenna on the ground, there will be a wave of rocket launches sending new satellites into orbit.
Of course, setting up a radio receiver on the ground is not the hard part about launching satellites, so this is unlikely to change much. When AWS starts letting you launch satellites for $1.99/pound, then we can start worrying.
Although that would be awesome. Spin up additional satellites via API. It takes a few days until they're online, but at least they stream the launch video feed in the AWS console.
Also, I think they kind of jumped the shark with this one.
The article mentions ubiquitous micro satellites. This tech is simply following from that although not directly stated in the title. The fact that these micro satellites are cheap and so easy to cookie cutter and deploy is great in terms of tech but seems to compound the issue of space debris. Anyway, just after becoming aware of this I was less enthusiastic about the communications aspects.
Downlinking data has never been the particularly difficult part of working with a satellite; getting it up there, with all the approvals that requires, was and still is the major hurdle. Just because it's easy to downlink doesn't mean the FCC is going to suddenly rubberstamp every startup's 10k-satellite constellation.
Depends how much data you want to downlink though. Sats in LEO have fairly small footprints and you can only get data when that footprint is over a ground station.
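A back-of-envelope on what one pass buys you; the 480 s of usable contact and the 200 Mbit/s link rate are assumed numbers, not from the article:

```python
# How much data does one LEO pass over a ground station actually yield?
# A good overhead pass lasts roughly 8-10 minutes; usable contact time is
# shorter once you account for low-elevation portions of the pass.

def data_per_pass_gbytes(pass_seconds: float, link_mbps: float) -> float:
    """Data volume for one pass at a given downlink rate, in gigabytes."""
    return pass_seconds * link_mbps * 1e6 / 8 / 1e9

if __name__ == "__main__":
    # 480 s of usable contact at a 200 Mbit/s X-band downlink:
    print(f"{data_per_pass_gbytes(480, 200):.0f} GB")  # 12 GB
```

That's why imagery operators care so much about the number and placement of ground stations: more sites means more passes, and every pass is a fixed-size bucket.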