Does this work on an iPhone? I was playing around with in-browser 360-degree video a while back and was infuriated to find that absolutely everything I required - WebGL, the Three.js library, <video> tags - was present, but iPhone Safari will only ever play videos fullscreen, after a tap to play, so it's impossible to route a video through WebGL.
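(For what it's worth, the WebGL side really is that simple. With Three.js it's roughly the sketch below - the file name is a placeholder and the scene/camera/renderer setup is omitted - and the video.play() call is exactly the bit iOS Safari blocks without a user gesture.)

  // Rough sketch: map a <video> onto the inside of a sphere with Three.js.
  const video = document.createElement('video');
  video.src = 'pano.mp4';    // placeholder URL
  video.loop = true;
  video.play();              // iOS Safari refuses this without a tap

  const texture = new THREE.VideoTexture(video);
  const geometry = new THREE.SphereGeometry(500, 60, 40);
  geometry.scale(-1, 1, 1);  // flip the sphere so we see its inside
  const mesh = new THREE.Mesh(geometry, new THREE.MeshBasicMaterial({ map: texture }));
  scene.add(mesh);           // scene, camera and renderer assumed to exist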
I really, really wish Apple would drop this restriction. Aside from anything else, the original intention (bandwidth savings) is totally ruined by the fact that everyone is encoding videos as (much larger) animated GIFs to get around the restriction. How about letting videos autoplay, but silently, and have "tap for sound"? Either that or disable autoplaying of GIFs, because the current situation doesn't really make sense.
No, iPhone is not supported yet, but we are working on it and will have support added shortly. We just have to use HLS instead of MPEG-DASH for the streaming, as Safari on iOS does not implement the HTML5 Media Source Extensions (MSE).
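(For context, the feature detection behind that split is roughly the sketch below - the stream URLs are placeholders, and the actual DASH handling would live in the player.)

  // MSE is what browser-side MPEG-DASH playback is built on.
  const video = document.querySelector('video');
  if ('MediaSource' in window) {
    // MSE available (desktop browsers, Android Chrome): use MPEG-DASH,
    // e.g. hand the player a 'stream.mpd' manifest here.
  } else if (video.canPlayType('application/vnd.apple.mpegurl')) {
    // iOS Safari: no MSE, but it plays HLS natively.
    video.src = 'stream.m3u8';
  }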
Android Chrome also requires user input (usually a tap) to start video playback. So a less drastic change would be for iOS to do the same, rather than just letting videos autoplay.
Also, if you want to use the Web Audio API to control the volume of videos like this, there's a known issue that prevents you from doing that on Android Chrome if you source the audio from a <video>.
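(The pattern in question is just routing the element through a gain node, roughly as in the sketch below; it's the createMediaElementSource step that reportedly misbehaves on Android Chrome when the source is a <video>.)

  // Controlling a <video>'s volume via the Web Audio API.
  const ctx = new AudioContext();
  const video = document.querySelector('video');
  const source = ctx.createMediaElementSource(video); // the step that breaks on Android Chrome
  const gain = ctx.createGain();
  gain.gain.value = 0.5;       // 50% volume
  source.connect(gain);
  gain.connect(ctx.destination);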
Am I alone in getting aggravated by people considering 360 video to be VR? If you want proper VR, you need a stereo display of 360 video. I feel like calling non-stereoscopic video VR is poisoning the well for the layman.
You are definitely not alone. This comes up whenever someone demos an immersive 360 video player. But you are being picky: arguing for hard-edged semantic barriers is possibly the greatest time-waster on the internet. It would be more productive to say that an immersive, 3-DOF display of mono 360 video is really crappy VR. Similarly, a good Google Cardboard experience can at least be stereo, but the 3-DOF tracking is probably terrible, so it is also really crappy VR.
Unfortunately, consumer VR in its infancy will not mature instantly into excellent hardware and software production value for all users on day one. Instead, it will follow a power-law curve like everything else does. In the meantime, the best we can do is try to remind the millions of people watching mono 360 videos on their terrible-tracking Cardboards that what they are seeing is really, really crappy VR.
I liken it to watching a cellphone video of a live concert on a cellphone. You are just barely watching a concert... And, if you've never been to one, you shouldn't assume you now know what going to an excellent concert is like.
You use a bunch of cameras, calculate a depth map, and use the spherical video plus the depth map to apply parallax, like this [0, 1] but spherical. It's a kind of faked stereo video.
Google Jump is one such camera rig; it uses 16 cameras [2].
You record from two sets of cameras positioned an average eye distance apart, then project one set of cameras to the inside of a sphere shown to one eye, and the other set to the other eye.
The limitation is that the cameras are arranged horizontally (current gen), so the stereo effect is lost if you tilt your head, as well as at the north/south poles of the spheres.
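(On the playback side this usually arrives as a single over-under stereo video. In Three.js you would render it with one sphere per eye, roughly as sketched below - video, scene and a VR/stereo-enabled renderer are assumed, and in recent three.js the left and right eye cameras render layers 1 and 2 respectively.)

  // Sketch: over-under (top/bottom) stereo 360 playback, one sphere per eye.
  const texture = new THREE.VideoTexture(video);

  function makeEyeSphere(vOffset, layer) {
    const geometry = new THREE.SphereGeometry(500, 60, 40);
    geometry.scale(-1, 1, 1);                  // view from inside the sphere
    const uv = geometry.attributes.uv;
    for (let i = 0; i < uv.count; i++) {
      uv.setY(i, uv.getY(i) * 0.5 + vOffset);  // sample only this eye's half of the frame
    }
    const mesh = new THREE.Mesh(geometry, new THREE.MeshBasicMaterial({ map: texture }));
    mesh.layers.set(layer);                    // 1 = left eye, 2 = right eye
    return mesh;
  }

  scene.add(makeEyeSphere(0.5, 1));            // left eye: top half of the video
  scene.add(makeEyeSphere(0.0, 2));            // right eye: bottom half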
And it really needs some degree of positional tracking as well for it to be true VR. Simple stereoscopic 360 video is far more immersive when your view responds realistically to small head movements.
Agreed, there is definitely a long way to go to get to a more realistic VR experience for video, especially for streaming. This is currently a starting point.
Have you seen that you can change the rendering modes of the player to also use it on Samsung Gear VR or Google Cardboard?
Slightly OT, but I have been looking for a live streaming / transcoding solution.
I.e., webcam in and streaming video out, that doesn't use FMS/Flash/etc. Is it possible in HTML5 yet? DASH seems promising - but is there a totally OSS solution available?
This is really cool, but it still lacks iOS support, right?
I was developing the app for differentperspectiv.es, and quickly realized that the easiest way for me to get 360-degree video support on multiple platforms was to build a Unity app and project the movies onto spheres as textures.
This project seems cool, but I would love to see the HTML5 player open-sourced.
We are working on iOS support; it will be added soon. It's pretty simple: we just have to use HLS instead of DASH, as Safari on iOS does not support the HTML5 MSE.
Error while saving Encoding Profile!: field 'videoStreamConfigs.codec' is invalid! valid values: (h264,hevc)
Source is VP8 WebM @ 4000x2000. I chose the 1080p adaptive profile and tried to add 4kx2k @ 20 Mbps and 2kx1k @ 15 Mbps. Both of the added renditions show VP8 as the codec but don't allow me to change it to h264.
I think it is awesome, and I was looking for something like this to play with (I've landed on bitmovin's page before :) looking for MPEG-DASH solutions).
It does pause/lag a bit when I move the camera around.
I really, really want to try it with live streaming (but I don't have the gear right now)
With this you can host your own 360° videos and don't need YouTube. This is particularly interesting if you want to monetize on your own, e.g., through ads or subscriptions. It's basically the infrastructure you need to build Netflix-like services.
I'm pretty sure you can't host it yourself - if you want to use their framework, you have to host it through them. At least, I can't find a download link anywhere.