hwers's comments | Hacker News

Sounds to me like Runway released it without consulting Stability and called it “1.5”, which according to the license they’re allowed to do but is pretty scammy, since Emad had hyped a model under that label. And now Stability is deciding to call this the official release to be nice to Runway and avoid a general PR mess and community infighting.


To me it seems the other way around.

1.5 was apparently held back by Stability for weeks. Runway finally decided to just release it.

Stability requested a takedown, and here you see the Runway CEO telling Stability in no uncertain terms: "this is ours to release, we created it, it's under an OS license, you don't hold any IP rights here; all you did was provide compute"

If anything this is a pretty stern rebuke of Stability and a sign of considerable disagreement between the two parties.


Well, if that’s the case, that’s still a pretty shitty thing to do on Runway’s part. Just be courteous to Stability’s needs and keep good business relations. Weird behaviour, and I wouldn’t be surprised if in the future Runway is silently excluded from pre-public releases (of which there seem to be many in the years ahead).


Doesn't the "OS license" mean that Runway has permission to release it already? Ack that there might be other agreements and business relations involved though.


Releasing it is fine; there have been lots of fine-tuned and continued-trained SD models. Just don’t call it “1.5”, which is the specific label for the model Stability is training internally. Again, the license ‘permits’ them to do it, but it seems like a very bad business decision: given what their service does, Runway would likely benefit hugely from early access to e.g. Stability’s future text2video models, which they now likely won’t get until everyone else does. That leaves someone else to possibly take market share in their field, and if that’s the trade they made because they got ‘impatient’, it seems awfully not smart.


It seems very weird to me that “stability” is building a company around something they didn’t create.


Seems to me that stability AI did a pretty shitty thing and runway ran out of patience. Pretty weird to paint runway as the bad guys here.


Didn't Runway create the model? How could Stability exclude them?


Seems really useful in a wasm context


This is Google; they for sure aren’t releasing the weights.


They release a lot of weights open source, including T5 (the underlying model they used in this work). They also indicated their intent here: https://twitter.com/aleksandrafaust/status/15799326368934420....



I’m seeing a lot of R&D solely focused on giving them the chance to extract rent through future silly things like buying artificially scarce houses in FB’s metaverse. I’m barely seeing any “giving back” type research, like what AT&T’s research contributed to the internet.


Their end-game is very much obvious, but honestly they are so bad at it... I doubt they will manage to build anything successful enough to extract any value out of it. The reason the Facebook metaverse is so cringey is that they have no clue how to make games or anything like them. The "real metaverse" already exists in Roblox, Fortnite and a few other popular games that every teenager socializes in.

At the same time, thanks to Facebook, any average Joe can easily get consumer-ready VR hardware for around $400. We just really don't have many companies behind VR except for Valve, and they simply don't have enough manpower to create mass-market hardware.

My point is that we must appreciate those engineers who persuaded Zuck to spend money on VR hardware and research. Yeah, their attempt at a "metaverse" is laughable, but the investment in hardware is priceless.


It looks like their AR/VR publication list is here: https://research.facebook.com/publications/research-areas/au...

For publications in general: https://research.facebook.com/publications/


> I’m seeing a lot of R&D solely focused on giving them the chance to extract rent through future silly things like buying artificially scarce houses in FB’s metaverse

An example from FB please?


Meta does plenty of FOSS work as far as giving back is concerned. But artificially scarce houses? How about a source for that.


hwers's comment lacks some context, but there is some truth to it. Consider Carmack's comment: https://www.youtube.com/watch?v=BnSUk0je6oo @33:50 onward "A closed platform doesn't deserve to be called a metaverse, or does it?"

Making money in the metaverse is a pillar of what makes a virtual world an actual world (ignoring who does the money making for now). Scarcity is an integral part of it: skins, models, rare items. In a virtual world, all scarcity is artificial by definition. Just like with Second Life and the Linden dollar making it onto real exchanges. Second Life sold worlds as a kind of virtual realtor in the past, VRChat's users sell custom-created skins and rigged models today, and Meta wants to be the platform facilitating that tomorrow, but on a grand scale.

I also highly applaud Meta's contribution to FOSS btw.


This is extremely well written


The only downside with this is that each mesh takes something like 5 hours to generate (on a V100, too). Obviously it’ll speed up, but we’re far from a panacea.


These things take months and months to train (hardly fast progress). Any new model that’s coming out is generally known to be in the air (not unpredictable), and these applications were pretty much expected the day Stable Diffusion came out.


"months and months"

At the beginning of this year, most technical people would have told you that graphic design was a decade away from being automated, and creative video production even further.

Now we are at "months and months".


That this would come out was totally predictable and expected by most in-the-know people in the ML world (we basically had proto versions of it in summer 2021). Not really the unpredictable trajectory I associate with a singularity.


This sounds really interesting, but I’m not sure I follow. I’m having a hard time expressing how I’m confused, though (maybe it’s unfamiliar NeRF terminology), but if you have the time I’d be very interested if you could reformulate this alternative method somehow (I’ve been stuck on this very issue for two days now, trying to implement this myself).


NeRF is "neural radiance fields", the neural 3D reconstruction method (NVIDIA's Instant-NGP is the fast version of it published recently).

Basically, if I’m reading it right, this does the synthesis in latent space (which describes the scene, rather than raw pixels), then translates it into a NeRF. It sounds kind of like the Stable Diffusion description that was on here earlier.
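
If it helps, here is roughly what that two-stage idea would look like. This is only a toy sketch of my reading, with made-up stand-in modules (TinyDenoiser, LatentNeRF), not code from the actual paper:

    import torch

    class TinyDenoiser(torch.nn.Module):
        # stand-in for the latent diffusion model
        def __init__(self, dim=64):
            super().__init__()
            self.net = torch.nn.Linear(dim, dim)

        def forward(self, z, t):
            # a real model would also consume a timestep embedding for t
            return self.net(z)

    class LatentNeRF(torch.nn.Module):
        # stand-in: maps (scene latent, 3D point) -> density + colour
        def __init__(self, dim=64):
            super().__init__()
            self.net = torch.nn.Linear(dim + 3, 4)

        def forward(self, z, xyz):
            h = torch.cat([z.expand(xyz.shape[0], -1), xyz], dim=-1)
            out = self.net(h)
            return out[:, :1], out[:, 1:]  # (density, rgb) per query point

    denoiser, field = TinyDenoiser(), LatentNeRF()
    z = torch.randn(1, 64)                        # start from pure noise
    for t in reversed(range(50)):                 # crude diffusion-style loop
        z = z - 0.02 * denoiser(z, t)             # placeholder update rule
    density, rgb = field(z, torch.rand(1024, 3))  # query points along camera rays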


Hey, as a European, where I doubt anything like this would ever happen, maybe I should take the perspective that this is a good thing for me, in the sense of a lot less future competition.


Should the US view it as a good thing that Europe's economy is crumbling under an energy crisis, since it means less future competition for us? I think not.


Unfortunately, our (American) leaders are very pleased at the prospect.

https://youtu.be/GK3u8up_HLA


Well it is what it is. Russia has doomed Europe to a long period of economic decline. The economic opportunity for the US is vast. This could be a period of economic power gains not seen since the end of WWII. To not recognize this would be foolish. It's not even accurate to paint it as "stealing" or "capitalizing" on Europe's weakness. They simply won't be able to produce things, and someone needs to, so it might as well be the US. Europe cannot be helped even if we wanted to.


Well, they even have a “download model” option, so yep, you definitely can. I wouldn’t think of this as an amazing panacea though: once everyone has access to it, whatever made producing assets like this valuable before will be dirt cheap for all, and thus actually a net negative for people in that industry. Just saying and warning, not to be a bummer.


Thx, I went back and saw that I missed this the first time:

"Mesh exports Our generated NeRF models can be exported to meshes using the marching cubes algorithm for easy integration into 3D renderers or modeling software."

Like they say, "This is the start of something big."


FWIW, there's still a pretty big gap between a single static mesh and something that is a usable asset, say in a game. Maybe this could provide a shortcut for a modeler to get started, but it's still going to take a lot of skill from that point.

