I really want VR computer monitors so that I can sit at a desk with no physical monitors and run 4, 5, 6, any number I want, or just one big monitor. It would also give me a lot more privacy: no one else could see my monitors unless I shared one with them, which I imagine would be easier to do with virtual monitors than with physical ones (even considering screen sharing, which is iffy sometimes).
Your use case is particularly apt for frequent fliers. I've almost had a tray table-supported laptop destroyed on more than one occasion by a reclining passenger in front of me. It would be amazing to just set up a small keyboard+trackpad on my lap and not have to even pull the computer out of its bag.
If I could do this, I wouldn't even care how ridiculous I looked.
I, too, want to be a gargoyle. The abundance of virtual space is one of the main draws that I don't think people really understand yet. But if you have a sufficiently hi-resolution display, you could easily watch an imax movie or put an insanely large amount of data in front of you in an intuitive, customisable way. This is one of the main points where I think people really need to see it done well to understand the potential. Good thing Carmack is on the case.
Not even close. The screens for each eye probably have about 1440 pixels of vertical resolution (I'm guessing they are Samsung OLED screens, similar to the display on the Galaxy S6). Imagine that resolution stretched out to fill your field of view and distorted by the lenses -- the perceptual resolution available for a virtual monitor would probably be less than VGA.
VR environments that can do high-quality typography will require something like 10k pixels vertical resolution. It's going to be more than a decade until we have the screens and GPUs that can push those pixels.
(I have a Samsung Gear VR with the Galaxy Note 4 which is a 2560*1440 screen split for two eyes, so I have a pretty good idea of what Oculus VR looks like on a 1440px screen.)
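That "less than VGA" claim is easy to sanity-check with back-of-envelope angular math. The field of view and viewing distance below are my assumptions for illustration, not published specs:

```python
import math

# Gear VR on a Note 4: a 2560x1440 panel split between two eyes,
# so each eye sees roughly 1280x1440 pixels. Assume ~96 degrees
# of horizontal field of view per eye (actual optics vary).
eye_pixels = 1280
fov_deg = 96
ppd_vr = eye_pixels / fov_deg                # ~13 pixels per degree

# A 14" 4:3 VGA monitor (640x480) viewed from 20 inches:
width_in = 14 * 0.8                          # 4:3 -> width is 11.2"
h_angle = 2 * math.degrees(math.atan(width_in / 2 / 20))
ppd_vga = 640 / h_angle                      # ~20 pixels per degree

print(f"headset: {ppd_vr:.1f} px/deg, VGA monitor: {ppd_vga:.1f} px/deg")
```

By this rough measure, a virtual monitor in the headset really does land below a desktop VGA screen in angular resolution.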
VGA had 480 scanlines. When was the last time anyone seriously programmed on a display worse than that?
My guess would be Commodore 64 with an NTSC TV for display. That's pretty close to the text fidelity you can reproduce on a "virtual monitor" in a current-generation VR environment. (I think a lot of people underestimate the amount of resolution lost due to lens correction -- the rendered screen image is heavily distorted to account for the wide-angle lenses.)
Well, I suppose it would be sort of fun to have a virtual C64, as long as the virtual surroundings are well executed: a complete high school kid's bedroom from 1986.
I recall being my most productive on an old IBM 3151 terminal (probably because of the keyboard, but also fewer distractions). It was only 80x24 characters. And 80x24 is easily represented in a 640x480 screen. Now add in head tracking, and you can have the screen move up and down and scroll the text in the opposite direction, making it appear as if you are moving a virtual picture frame over a bigger sheet of paper.
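A minimal sketch of that "picture frame" idea: map head pitch to a scroll offset into a longer document, so a fixed 24-line window pans over a larger sheet. The function name and the lines-per-degree constant are invented for illustration:

```python
def scroll_offset(pitch_deg, lines_per_deg=3.0, total_lines=500, window=24):
    """Map head pitch to a line offset into a longer document.

    Tilting the head down (positive pitch) pans the 24-line window
    further into the text, like sliding a picture frame over paper.
    """
    offset = int(pitch_deg * lines_per_deg)
    # Clamp so the window never runs off either end of the document.
    return max(0, min(offset, total_lines - window))

print(scroll_offset(0))     # 0   -- neutral gaze, top of document
print(scroll_offset(30))    # 90  -- 30 degrees down pans 90 lines in
print(scroll_offset(999))   # 476 -- clamped at the end of the document
```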
Even today, most of my (C / Bash) programming is done in an 80x24 xterm, although I'll typically have documentation open in another window. However web and GUI app development requires enough room to see the resulting product.
Although I have much higher resolution, lately some ocular migraines have forced me to up the font size on my monitor. Having really big type seems to stave off the migraines for some reason.
Anyway, I think I've been going for about a month and a half with 80x27 text displays (full screen with tmux). I'm doing web development and my browser is similarly scaled (thank god we're building something with a scalable UI).
I have not missed the screen real estate at all. Of course, I'm old and I used to always work on 80x25, so the extra 2 lines are luxury ;-)
For one of my projects, I am writing code that is meant to be read by the 99%, i.e. formatted to be readable on a phone: 40 chars wide, and I rarely look at more than 10 lines at a time.
It's generally fine. Every now and then I feel like I want to see the broader context of the code, but it's rare. In general it forces me to write much cleaner, more readable code, and to structure things better so that concerns are truly isolated and I don't need to look at a lot of code to see how things work. Overall I think it's a positive experience.
Other than "can't fall off a seat tray", there isn't much point in having 4 virtual monitors if it gives you less readable real estate than a real monitor.
Really? Even if the visible, readable real estate is limited, I think there'd be a huge usability boost from using... "virtual monitors" is a terribly ambiguous phrase... virtually co-located monitors?
The key metric for me is how much information I can access with input under a minimal threshold. From personal experience, a hotkey to swap virtual desktops (e.g. Alt+Up) still isn't the same as having multiple physical monitors to reference.
However, I'd expect VR head orientation changes to look at different monitors to be fairly similar perceptually to what I do now, since it's the same physical action.
The bigger problem for the seat-tray scenario is that, afaik, both Oculus and the Vive use externally located tracking devices. Would be curious whether a fuzzier, internal-sensor-only, limited "intent" tracking mode (e.g. flick head to switch monitor) would make people hurl or not.
> Would be curious whether a fuzzier, internal-sensor-only, limited "intent" tracking mode (e.g. flick head to switch monitor) would make people hurl or not.
Yes, it would. This has been studied pretty extensively, the head movements need to be very precisely matched by rendering. The absolute worst you can do is any kind of non-linear response -- acceleration + lag can make people who are very tolerant of VR nausea literally throw up.
It's that bad. It's a well-studied phenomenon called 'simulator sickness': a kind of motion sickness that, for some, can persist for days beyond the initial experience.
If you don't experience it, good for you, you are one of the lucky ones.
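For what it's worth, the standard mitigation for the lag half of that equation is prediction: render not where the head is, but where it will be when the photons reach the eye, extrapolating from the IMU's angular velocity. A toy one-axis version (real systems predict a full orientation quaternion):

```python
def predicted_yaw(yaw_deg, angular_velocity_dps, latency_ms):
    """Extrapolate head yaw forward by the motion-to-photon latency.

    Rendering at the predicted pose rather than the last sampled one
    is what keeps the world visually locked during fast head turns.
    """
    return yaw_deg + angular_velocity_dps * (latency_ms / 1000.0)

# At 200 deg/s with 20 ms of latency, rendering the raw sample would
# lag the true head pose by 4 degrees -- easily enough to notice.
print(predicted_yaw(90.0, 200.0, 20.0))  # 94.0
```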
No, you don't know what you're talking about. Simulator sickness is not a black or white issue. It's even possible to avoid it by making the display worse.
Makes sense - worse means less involvement with your sensory expectations. It's when your brain is convinced it should sense vestibular changes and it doesn't that you throw up, get nauseous, dizzy, etc. The difference between 'looking at' and 'believing you're inside of' a simulation hinges on the quality of the experience.
Oculus has been riding the smartphone screen density wave but I'm not sure that will continue much longer since for phone use going beyond 500ppi doesn't make much difference.
Maybe some kind of projected light field will be the next solution?
GPUs will keep getting faster and smaller. Have you seen NVIDIA X1?
I've seen the idea of multi-resolution rendering floating around, which I think has some promise when combined with eye tracking, would help with the GPU side of things.
Not so much on the "raw display hardware" side of things though. I could still see it being a decade+ out on that alone.
Cool stuff, but it needs extremely fast and low latency eye tracking to work effectively. It's not insurmountable, but a significant engineering problem.
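The payoff of multi-resolution rendering is easy to quantify with a toy two-layer scheme: shade a small central region at native resolution plus the whole view at reduced scale, then composite. The region sizes below are invented, not from any shipping implementation:

```python
def foveated_pixels(width, height, inner_frac=0.3, outer_scale=0.5):
    """Pixels shaded by a two-layer foveated scheme: an inner window
    at native resolution plus the whole view at a reduced scale."""
    inner = (width * inner_frac) * (height * inner_frac)
    outer = (width * outer_scale) * (height * outer_scale)
    return inner + outer

naive = 2160 * 1200                    # shade every pixel at full res
foveated = foveated_pixels(2160, 1200)
print(round(foveated / naive, 2))      # 0.34 -- roughly a 3x saving
```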
I'm not criticising your project at all -- just saying that my experience with current VR display tech is such that I can't imagine reading text for more than 30 seconds. The aliasing and distortion would make my eyes bleed pretty quickly.
The time will come for textual VR environments though! I can't wait to have "newspaper resolution" for textures (hold up a virtual newspaper in a VR world and you can actually read the text like it were a printed page). But I suspect it's not going to happen before the mid-2020s at least.
It's nowhere near as bad as you've described. Even on the 960x1080 DK2, the readability of text has more to do with texture quality (both total resolution and good filtering for mipmaps). On a 2D display, the pixels are static on the display and text is mapped essentially 1-to-1 (or a fixed n-to-m when using subpixel antialiasing). But on an HMD, antialiasing is basically 'free': the constant, minuscule bob of the head creates a dynamic, temporal antialiasing that improves the readability of the text. We've known about the effect since the 80s or 90s, when psychologists figured out that the dynamic environment of real flight improved the apparent eyesight of pilots over static images.
There are a lot of people who are doing text wrong in VR. You don't render at 10pt and expect it to look right. You have to pay attention and not just take the default settings.
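One way to "pay attention" is to work backwards from angular resolution: fix a legibility budget in pixels per character and solve for how large the text has to appear. Both constants here are rough assumptions:

```python
def min_char_angle(pixels_per_degree, min_pixels_per_char=10):
    """Smallest angular width a character can subtend and stay legible,
    given the headset's effective pixels per degree."""
    return min_pixels_per_char / pixels_per_degree

# A DK2-era headset delivers on the order of 11 px/deg, so each
# character needs to span nearly a full degree of the view --
# several times larger than comfortable desktop text.
print(round(min_char_angle(11), 2))  # 0.91
```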
I've spent much time reading and writing text in VR and it hasn't been a problem. If building a real, live code editing environment were my goal, I could have it done in a week. But I have different goals.
That's very interesting. My experience may be colored by the Oculus Mobile experience, which has a higher resolution than the desktop DK2 but a much weaker GPU and much less RAM to play with. (That's because the VR is rendered by the phone.)
Texture filtering on the Galaxy Note 4's embedded GPU is probably not a priority. On a 2560*1440 phone screen, who can even see those artifacts? So it could be that the hardware and drivers are taking quite a few shortcuts there, and those come back to haunt you on the Oculus.
It could just be the programmers who made whatever you've tried don't know very much about texturing. Most of the AAA games I've tried in VR have not had text that was even remotely readable, and it was all because it was rendered too small.
The Gear VR has a fairly decent GPU. The move to mobile was more about discarding legacy fixed-function pipeline techniques and more directly optimizing for shaders. The main factor limiting texture-fill is GPU RAM. But most devices have more than enough RAM to be able to render text well.
VR is a realism multiplier. Traditional 3D graphics techniques are realism fakers. You shouldn't apply many traditional graphics techniques, because they have perspective-dependent artifacts that are subtle to impossible to notice in mono still images but glaring in stereo motion. So I think it's folly to apply too much effort to things like displacement shaders or stereo textures. These sorts of things both A) cost a lot of time, and B) look terrible in an unrestricted stereo view. There's even new research to suggest one's general hormonal balance can have a huge impact on whether or not these miscues are going to cause simulator sickness in a user. You could literally be alienating half of your potential users along gender lines.
In the face of that, I think it makes more sense to stick to simpler techniques that respect the Hippocratic Oath: first, do no harm. We've been able to write some 3D games that run at 150fps for over a decade now. But there is no 150hz display to pair it with. V-sync is probably the most important issue, followed closely by running at native resolution (so for a given GPU, a lower resolution screen might actually be better, because it will be easier to hit the full refresh rate at the native resolution) and clean, consistent antialiasing. It wouldn't be a problem if both eyes were rendered identically, but having the dual eyes highlights any visual artifacts that appear. Match those three issues and the realism of the content doesn't matter, it could be flat-color, Phong-shaded cubes and you'll have a great VR experience. Miss any of them and even high-end games like Elite: Dangerous that are gorgeous on 2D will look like complete garbage.
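The v-sync point can be made concrete with a frame-budget calculation: under v-sync, a frame that misses the refresh window isn't a little late, it's held for an entire extra refresh interval.

```python
import math

def displayed_frame_ms(render_ms, refresh_hz=75):
    """Effective frame time under v-sync: a frame that misses the
    refresh window waits for the next one, so times are quantized."""
    interval = 1000.0 / refresh_hz          # 13.3 ms at 75 Hz (DK2-era)
    return math.ceil(render_ms / interval) * interval

# Rendering in 13 ms just makes every 75 Hz refresh; 14 ms silently
# halves the displayed rate to 37.5 Hz -- a judder cliff, not a slope.
print(round(displayed_frame_ms(13.0), 1))  # 13.3
print(round(displayed_frame_ms(14.0), 1))  # 26.7
```

This is why a small miss of the budget hurts so much more in VR than a gradual framerate dip does on a 2D display.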
VR has completely inverted the priorities of graphics programmers. Because of this, I think the primary VR innovation is going to come from indie developers, because most established companies can't switch their focus away from their 2D-display oriented consumer base. The Call of Duty series has sold over 175 million units. Assume nearly $40 a pop, and that makes over $7 billion. You're not going to see VR getting anywhere near that for a few years still, and companies like Activision and EA aren't going to chase after such small potatoes in any way other than just PR.
Please elaborate! I am curious what your experience and setup is. I have zero interest in consuming media or games through VR but I am really wondering how it can be productive. Do you get more work done compared to regular monitors, or is it the same work while feeling 199% cooler?
Right now, the resolution is clearly not high enough for a straightforward implementation. I've got some ideas I'd like to play with around green-on-black text rendered directly in the post-lens-warp buffer, but I haven't had time to try it out yet.
Didn't SDK 0.6.0 remove some of the ability to do that stuff?
> Removed support for application-based distortion rendering. Removed functions include ovrHmd_CreateDistortionMesh, ovrHmd_GetRenderScaleAndOffset, and so on. If you feel that you require application-based distortion rendering, please contact Oculus Developer Relations.
Resolution of the final hardware will be 1080x1200 per eye (2160x1200 total) over a 90° field of view.
I'm not quite sure how to translate that into pixel density properly given the resolution is spread between your eyes, but I'd guess at least as good as a 1080p screen that fills a 90° angle of view?
The user experience would be quite different, as you could simply rotate+tilt the viewport to see sections of a larger virtual monitor screen. So it wouldn't necessarily be as restricting as low resolution on a fixed screen.
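In angular terms the translation is straightforward: divide per-eye pixels by the field of view, then compare against the same figure for a desktop monitor. The monitor size and viewing distance below are assumptions picked for comparison:

```python
import math

# Consumer Rift spec: 1080x1200 per eye over roughly 90 degrees.
ppd_rift = 1080 / 90                       # 12 pixels per degree

# A 24" 16:9 1080p monitor viewed from 24 inches:
width_in = 24 * 16 / math.hypot(16, 9)     # ~20.9" wide
h_angle = 2 * math.degrees(math.atan(width_in / 2 / 24))
ppd_monitor = 1920 / h_angle               # ~41 pixels per degree

print(f"Rift: {ppd_rift:.0f} px/deg, desktop 1080p: {ppd_monitor:.0f} px/deg")
```

So even the consumer hardware delivers roughly a third of the angular density of an ordinary desk monitor.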
1080p that fills a 90° angle of view would be like looking at a 40" 1080p TV from ~28 inches away. Certainly not the same experience that you're used to with a retina display, unfortunately.
That's if you're looking at the full screen. Imagine walking closer, and having the resolution "increase" as you get closer (because you have the same number of pixels, looking at less "stuff" per pixel).
I'm not sure people would see the benefit of using it if you had to peer so close that each character took up a third of your displays just so it was at a comfortable resolution.
A readable 80x24 text mode is the bare minimum for programming IMHO. It's probably detailed enough for that, but it's not going to rival your laptop/desktop experience any time soon.
I'd think the best approach for something like this would be to have higher pixel density in the middle of the display -- that is, where your head is looking -- and lower pixel density in the periphery. That's tough to do with an LCD, but it could be done with a specially designed projector or two different LCDs combined optically.
A podcast I listen to recently discussed the theoretical horror of flying while being stuck next to a passenger strapped into an Oculus Rift who is totally oblivious to how annoying they are being to their fellow passengers with their gestures and reactions to the VR experience that no one else is seeing.
From what I've seen, the weirder reaction to people wearing GearVR on planes is when they are interacting with the world through the pass-through camera, since people expect the HMD users to be oblivious to their surroundings.
>>A succubus is a female demon or supernatural entity in folklore (traced back to medieval legend) that appears in dreams and takes the form of a woman in order to seduce men
I believe that it was as much social as technical issues that held back Glass. Like it or not, we can surely expect social push-back on a VR device which will make its users appear to onlookers as almost masturbatory.
Besides resolution, you also have to deal with the fact that the oculus rift really only works well with a dedicated GPU. It's going to be a while before it's a legitimate "mobile" solution.
I got a DK2 and have barely touched it because my Macbook Pro 13" just wasn't adequate.
While this may be true for high fidelity gaming, for moving some textured triangles around, mobile GPUs are fine. Also, they can (and do) perform much better clock-for-clock vs a PC (and especially a Mac) thanks to the ability to optimize drivers. For a fascinating discussion of the optimizations possible, Carmack's 2014 Oculus Connect keynote is well worth watching: https://www.youtube.com/watch?v=gn8m5d74fk8
This talk was a slightly long but very informative discussion of the various challenges in reducing latency and dealing w/ PC rendering: https://www.youtube.com/watch?v=PoqV112Pwrs
Also, consider a future where a laptop is just a ~silicone skin over the current "bottom half" of a laptop. Batteries could also be smaller since the screen currently eats so much juice, or it could stay the same size to power the VR headset (but I imagine that the smaller screens would use less juice).
This will be somewhat problematic until there is some sort of a beacon that acts as a fixed reference point in front of you. If there is no positional tracking and it's just the headset's IMU, the plane's movement (e.g. banking) would affect the tracking.
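The failure mode is concrete. An orientation-only tracker integrates gyro rates and corrects their drift by assuming the accelerometer, on average, measures gravity; in a banked turn that sustained acceleration is tilted, so the "down" reference tilts with it. A toy complementary filter shows the dependence (the filter constant and bank angle are illustrative):

```python
def fuse_roll(roll_deg, accel_roll_deg, alpha=0.98):
    """One complementary-filter step: trust the gyro short-term, but
    pull slowly toward the accelerometer's gravity-derived roll."""
    return alpha * roll_deg + (1 - alpha) * accel_roll_deg

# The head is actually level (the gyro contributes no rotation), but
# the aircraft holds a 15-degree bank, so 'gravity' appears tilted.
roll = 0.0
for _ in range(200):          # ~2 seconds of samples at 100 Hz
    roll = fuse_roll(roll, 15.0)
print(round(roll, 1))         # ~14.7: the virtual horizon tips over
```

Within seconds the tracker believes the user's head is rolled to match the bank, and the virtual monitors swing away with it.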
Really? Is this actually a concern to anyone who's been programming for over a year?
The only remotely challenging part might be symbols on the number row (that's for me at least). And for those, there's actually NO reason you couldn't have a virtual keyboard you can look down at.
There's a bias here. Most of the users of this site likely use a keyboard as part of their daily jobs. Proficiency in touch-typing is essentially part of our job descriptions.
I've actually thought a great deal about this. I think the solution is a keyboard that exists both IRL and in the virtual space: some sort of cheap system to keep track of its relative position with respect to the VR headset would allow a user to look down and "see" the keyboard in virtual space as it exists in reality. There are a number of unknowns that would need to be probed if somebody were to pursue this solution directly, but I don't work with VR stuff so I can't see if it would actually be useful in practise.
I haven't tried it yet since I have no HMD but I read some good stuff about it. Don't know if it will run more than one monitor though but that feature seems obvious to me.
Like others have mentioned, the big problem at the moment is the resolution. You have basically the same res as a 1080p screen strapped to your face, so the res that you effectively see will be lower than 1080p. Reading stuff will be more difficult, as you can imagine.
I tried this for about 6 hours straight using the DK2, and there seems to be a sort of catch-22.
By default, the resolution is not high enough, but your head can stay stationary, since from the default head position you can see the entire screen.
Virtual Desktop allows you to zoom in, and this then works wonders. However, you need to start moving your head to see the corners of the screen. Naturally, your neck muscles become the first bottleneck.
What if focus followed window manager focus? I'm guessing there will be more innovation in window managers once VR like the Rift is available + affordable.
I've thought a lot about this as well. There's one big problem I see with this: your eyes would get really tired. The nice thing about having a real-world monitor is that your eyes get a lot of ambient light as well. So, if Microsoft can really develop and evolve their HoloLens, that could be one solution to this problem since ambient light would come through the transparent display.
Update: as other people have mentioned, the resolution is also not quite there yet with the Oculus to have a convincing virtual desktop.
Current HMD lenses collimate the light so you focus at infinity, which isn't too stressful for your eyes. There's still strain from vergence-accommodation conflict, but I think light field displays will outpace the holographic waveguide tech that MS is using.
Because of the initial low pixel density it'd make sense to have a billboard sized screen, which would solve that issue. There's no reason that you'd need the stereo disparity for reading text.
In practice, anything that's placed 10ft+ away doesn't cause much stress for me although that's probably somewhat personal.
I think within 2-3yrs displays that solve the focal plane issue will be commonplace so this will be a bit of a moot point.
Wouldn't focusing on infinity be a problem for people who are myopic? I imagine everything would be blurred, and I don't think glasses would be wearable together with the VR gear.
Yes it is a problem. Morpheus, Vive and CV1 provide a larger eyebox so you can wear eyeglasses. GearVR has diopter adjustments (something like -4 I think).
Hi. FOVE's screen is WQHD (2560*1440).
We are also planning to add iris recognition for securing privacy.
It should make for a secure headset well suited to use as a personal monitor.
(bit.ly/FOVEKS)
Closer than you might think. At 4K, the arc density is about a 14" VGA screen, which was totally usable. At 8K you approach a 24" HD monitor. That's ignoring the additional resolution you get from high refresh rates.
At first glance I was thinking of the hell that it would be to run a current IDE on a 640x480 14" monitor. But at 1280x960 on a 28" monitor, it may work pretty well.
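The same back-of-envelope check supports the 4K claim, assuming the panel is split between the eyes with roughly 90 degrees per eye (assumed figures, not specs):

```python
import math

# Hypothetical 4K headset: a 3840x2160 panel split across two eyes.
ppd_4k = (3840 / 2) / 90                   # ~21 pixels per degree

# A 14" 4:3 VGA monitor (640x480) viewed from 20 inches:
width_in = 14 * 0.8                        # 11.2" wide
h_angle = 2 * math.degrees(math.atan(width_in / 2 / 20))
ppd_vga = 640 / h_angle                    # ~20 pixels per degree

# The two land within a couple of px/deg of each other, which backs
# up the "4K is about a 14-inch VGA screen" intuition.
print(f"{ppd_4k:.1f} vs {ppd_vga:.1f}")
```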
Do you? I can't imagine why anyone would want to strap something to their head that would make the programming experience more tiring than looking at a monitor for 6-8 hours a day, which this surely will. Ditto the speculation that this could somehow be a transformative social experience.
I remember a conversation with a friend about twenty years ago (yeah, I'm old). We were talking about the new exciting thing, mobile phones. "There's no way anyone, save for a few weirdos or people who must have it because of their job, will willingly give up their privacy by having a phone in their pocket all day long!" said the friend. I remember it like it was yesterday.
I was talking about the danger of predicting success of some technology based on personal bias, not about similarity between a VR headset and a mobile phone.
If you have a VR headset, you may as well get a comfy reclining chair too. Or lie down on your bed/couch -- either way, your head can be supported by things that are not your neck muscles.
They'd better not dare make this Win10-exclusive and hose people who plan to stay on Win7 for the foreseeable future.
People were all up in arms when Facebook entered, yet there was never an overlapping issue that would mean the Rift would be gimped or held hostage or anything. People who were suggesting you'd need an FB login to use the Rift were, quite frankly, talking out of their behinds.
But this deal now with MS is awful. There was no need for Oculus to do that. None.
It clearly wasn't a money thing. It wasn't for the controller, as they are clearly working on their own. What on earth possessed these people to go with MS when it is clear that MS is fully focused on its console and tablets?
This does not paint a good picture of what the Rift might be reduced to just to make it fit onto all of MS's stuff. Pretty much like the "consolification" of the desktop OS they are currently pursuing.
Ah, that explains why they call the Xbox controller "one of the best in the world" and are shipping it in the box, instead of just letting people decide for themselves or at least making it optional. This might actually stop me from getting one; I'm not sure how much a controller normally costs, but it will increase the price, and I have a USB DualShock 2 controller right here which I prefer over the Xbox one.
They stopped both Linux and OS X development. Previously they said it was to focus on getting the best experience on the biggest platform first, and then expanding to OS X and Linux.
The Xbox controller is really surprising to me. Years in development and it will ship with an Xbox controller. Oculus Touch didn't even get demonstrated and "prototype" was mentioned time and again.
That really doesn't bother me. Having used an Xbox One controller regularly, I think it is a fantastic controller, and I like that Oculus didn't feel the need to reinvent the wheel. I am excited about new interfaces other than a traditional controller though.
VR definitely needs a reinvented wheel in this case. Being able to move your head in virtual space but not your hands is clearly suboptimal, and the motion sensors in that generation of controllers are not up to the task.
My take: this is a bandaid, so they can get the headset out in Q1, ahead of the proper controllers in Q2. Not what they wanted to do, but the Vive forced their hand on timing.
I'm curious as to why you think that freedom of hand motion is essential for VR. Certainly I can think of many VR experiences where it would be useful, but I can think of just as many where having discrete inputs (buttons) would be preferable.
For example, a spaceship piloting game that allows the player's head to rotate and look out the windows could very easily be controlled with a traditional gamepad.
I definitely agree with you that the "cockpit" games (racing, space combat, et al.) and things like Lucky's Tale are well served by the current controller, but I don't think it's going to serve VR well outside of those niches, which isn't what you want from the default, bundled controller.
Even just a few years ago most people were thinking that game genre popularity would map straight into VR - i.e. Witchers and CoDs would be king. What's becoming more and more apparent is that VR is such a different medium that we need a whole new set of game genres and mechanics.
For instance: locomotion with the left controller stick, head movement with the right (which the current controllers are brilliantly adapted for) is such a core controller "idiom" right now, but when applied to VR it's just horrible - a recipe for nausea and disorientation.
If you look at the kinds of experiences that are being developed around Valve's controllers, they're all huge departures from the games we currently play, and the reason is that Valve, as game designers, realized from an early point just how far back to the drawing board we need to go. VR is just too different, with different objectives (immersion), possible modes of interaction, and human weaknesses to address.
Because the entire VR experience is a summation of its immersive elements. Immersion in one domain (control, sound, display) can make up for immersive shortcomings in another domain.
Interestingly enough, it looks like it will be wireless, which currently isn't possible with the PC. The Xbox One controller is only wireless for Xbox, the drivers/wireless receiver aren't available for PC. I wonder if they partnered with Microsoft to make it wireless?
They could have rebranded or restyled the controller to actually match the headset. It's a bit strange to buy something from them that you could have picked up off a Walmart shelf for years. Make it exclusive somehow.
Since Oculus is under Facebook, and Facebook has been fairly Microsoft-friendly in the past (opting for Bing maps, some Office docs stuff, etc.), it doesn't seem like a huge surprise, especially since Sony/PS4 has their own thing in development (Morpheus).
I guess this also explains the earlier announcement dropping OSX and Linux support.
I think it is more than Facebook being MS-"friendly". Like it or not, Windows is still the dominant platform for PC gaming, period. As much as many would like this to change, it is not even close to happening.
Fair enough but I don't think a display is the place to start when trying to change something like this.
I am sure the Vive will be great, but let's face it the vast majority of titles on Steam are Windows only. Valve may like the idea of Linux gaming, but they push a lot of Windows games...
It's not just the display. I'm presuming there will be a whole ecosystem of games designed with the Rift in mind. By making it Windows-only you effectively tell all those developers not to bother developing their games for OS X or Linux.
That is just it, though: it is not a stupid attitude, because the market that cares about this sort of thing and buys games is very small. Most of the complainers on HN do not play games. Over 95% of machines using the Steam platform are Windows.
The Xbox One controller is certainly good, but the mini-trackpad-button-hybrid on the PS4 controller is a real touch of genius in interacting with menus, when support is properly implemented.
Really? What games have you found where the trackpad has worked well? I haven't played too many PS4 games, but no game that's implemented the trackpad has done it in a way that I feel has been to my benefit.
So far games haven't really taken advantage of it, except to gain two more buttons (it can sense which side you're pressing down on). But the built-in keyboard uses it, for example, which makes typing a breeze.
The potential is there, it's just up to developers to actually use it. Keep in mind that the PS4 didn't even have a single truly good exclusive game until Bloodborne came out recently. It's taking this new generation of consoles a long time to really kick off.
The DualShock 4 is arguably even a better PC controller than the Xbox controllers. Its out-of-the-box support is decent, but it really shines when paired with ds4drv for Linux or DS4Windows (both open source tools), which allow you to play Xbox controller games, use the touchpad as a mouse, set LED colors, and create macros/profiles.
The X360 controller is/was the gold standard, the XBONE controller has more mixed reviews. I personally prefer Sony's DS4 to the XBONE controller (for next-gen controllers). But I still keep an X360 controller around.
The Xbox 360 controller was the go-to reference controller for pretty much any PC-released game in the last 5 or so years - mostly due to its automatic integration with the XInput protocol used by pretty much everyone. I guess the only thing stopping the One controller from replacing it is the fact that the PC wireless receiver hasn't been released yet.
Nintendo never made anything that would connect to PCs.
Really? I certainly liked the N64 and Gamecube controllers, but they're not exactly ideal designs for a "universal controller". The wiimote had kind of crappy motion controls and was borderline useless for anything that didn't involve motion. The Wii U uses upgraded wiimotes that suffer the same problem, and the gamepad is more like a giant gameboy than a controller. Plus we can't use any of those controllers on a PC without additional adapters or significant hardware hacks. The Xbox 360 (and to a similar degree Xbox One) controller is designed to be universal, and if you look at what people are using for PC gaming it's pretty much just those two.
The Wiimote is just a bluetooth device, there's no hardware hack or adapter necessary to connect it to a PC, you just need a driver. Not that I ever used it outside of an emulator.
Nintendo controllers are god tier in quality, reliability, and unique integration with games. When I see the Xbox controllers all I can think of are Microsoft Sidewinders.
I was equally surprised by the on-ear style headphones. I'm sure the design team thought about the choice in depth and has good reasons behind it, but I would have imagined over-ear to be better for immersion (personally I find them more comfortable as well).
The technical terms are supra-aural (on-ear) and circum-aural (around the ears). Both can be sound-wise excellent, but circum-aural is definitely more comfortable.
I used to own a pair of Joe Grado HP-1/HP-1000 supra-aural headphones, and they were excellent and did a great job with binaural recordings. Since selling those, I moved on to the Denon D7000 circum-aural headphones. Far more comfortable. Bar none, the best audio experience I have had was with AKG K-1000s hooked up to 75-watt tube monoblocks. However, that wouldn't have worked for VR presence because of the latency from the tubes. With solid-state speaker amps to keep the audio in sync, the presence would probably be unreal, assuming the K-1000s do well with binaural recordings, but that is something I'm uncertain of.
I'm not arguing about sound quality though. When I said "immersion" I was thinking about over-ear headphones generally having better isolation than on-ear.
Sound quality isn't really dependent on circum-aural vs supra-aural headphones. The major complaint I hear with supra-aural headphones is that they apply pressure on the ears, which can be uncomfortable. Sound quality, however, is largely determined by the drivers.
Will these apply much pressure on the ears? Many headphones use the earpieces to stay in place, but these are positioned by the Rift's straps which should be able to do it without pressure from the earpieces.
It's not just listening fatigue from sound, but the pressure applied to the ear-lobes. I have used circum-aural headphones that I can use for hours upon hours since they were very comfortable (no pressure) and were not too bright or forward that they caused listening fatigue.
Yes, I have a pair of Grado SR60s (supra-aural) which are by far my best-sounding headphones, but I cannot use them for many hours on end for this reason; as loose as I try to set them, they eventually become uncomfortable.
What is really needed IMHO is a mind-reader. Something simple that would detect brain waves and allow you to move around. That way you could aim by turning your head and move by thinking about it.
I know the tech has some way to go before it gets there, but that is the missing piece that would make this the next level of computing (and probably the one that would cause Half-Life 3 to finally show up).
Head turning without thinking about it would probably induce motion sickness. I think something the earlier Rifts taught us (through low framerates, lack of head tracking, etc.) is that our brains freak out if there are any significant inconsistencies between our movements and what our eyes see.
I for one agree with you. They punted on the controller in order to make the release, and their prototype controllers look underwhelming in comparison.
It feels like they've cut corners in an effort to beat Valve's VR platform to market.
I was fairly disappointed by most of the keynote, but maybe my expectations were just too high?
I was hoping for some real innovation that elevated it above Vive, but it just didn't seem to deliver. The new controllers look great, but they aren't going to be around until mid 2016, whereas I will probably have the Vive ones in my hands by the end of year.
I think possibly Oculus are more focused on being consumer friendly than Valve, but the inner techy in me is certainly more in love with Vive and their ecosystem at the moment.
The entire cinema mode integration with Microsoft just seemed like a total misstep to me - why would I want to play a game in a virtual living room? It just looked cheesy, gimmicky and silly. The fact you have to stream it over a PC as well and can't just hook the headset straight into the Xbox seems like it's just way more hassle than it's possibly worth.
I think the cinema mode is to ensure that you don't have to take the device off when you switch between a game that's designed to support the Rift and software that does not. With that in mind, it seems like a reasonable compromise. I mean, do you want to have to take the thing off every time you alt-tab to the Windows Desktop, or do you want to be 2" from the Windows Desktop?
I think it's twofold. First, you can't just slap VR into existing games; it needs to be considered and implemented deliberately. Second, I don't think the Xbox One is powerful enough to drive an Oculus Rift. The recommended specs for the Oculus Rift are "NVIDIA GTX 970 or AMD 290, Intel i5-4590" or better. The Xbox One doesn't have that CPU power, and it certainly doesn't have that GPU power.
They're not shipping until Q1 2016? With all that funding? They're going to miss the 2015 holiday season? Zuckerberg needs to kick some ass over there.
Having all that funding means having the luxury of shipping a properly debugged and refined product, instead of having to chase the holiday season with a potentially bugged and broken one (which is far from an exceptional occurrence in the videogame industry).
Yeah, I was disappointed they didn't announce the price. I can understand why, though: if it's expensive (which I expect it will be), it would probably be a distraction from their announcements.
Separating the controllers into two devices, one for each hand, is a natural way to do input in VR, but I feel that hand and finger motion should also not be limited by the need to hold the controller. The pointing and thumbs-up gestures are OK, but imagine if the computer could respond to the full range of motion, à la Leap.
Maybe it would look something like a cone whose base wraps around your wrist and whose rim contains Leap sensors that can recognize the full 3-dimensional range of hand/finger movements.
But I think the intention with this device is to make something 'more' than just gestures. They wanted to make a device that can understand gestures, but also gives you finer controls (buttons, analog sticks, grasping). It seems like they really want to get the basics right, and then expand on all sorts of forms of input in the future depending on the reaction and ideas of devs and gamers. In their way of thinking you could buy a Leap Motion right now, strap it to the headset and have complete finger tracking. The only downside is that there will probably be more applications using Oculus' Touch device and only a few novel, unpolished apps using something like a Leap Motion.
Maybe there will be a device like you explained in the future.
Hope you'll give it a shot; Image Hands significantly reduces the kinds of jitters that can ruin a decent user experience. In case you haven't yet, be sure to calibrate and optimize your tracking: http://blog.leapmotion.com/troubleshooting-guide-vr-tracking...
Dragonfly has some pretty significant hardware advantages for passthrough and FOV, but the biggest challenge is really at the software level, which is common to all our devices. (Full disclosure: lead writer at Leap)
The LEAP is a really inexact input device. Used it for a project this winter and the only reliable input we found was the position and rotation of the whole hand.
Lead writer at Leap here; sorry to hear about your bad experience. We hit some solid tracking improvement milestones this spring with the 2.2+ series of core software builds, and there are some larger improvements on the way. If you're looking to revisit it at any point in time, be sure to try our VR troubleshooting guide: http://blog.leapmotion.com/troubleshooting-guide-vr-tracking...
If that doesn't work, you can reach out to me directly: acolgan@leapmotion.com
I think you overestimate the draw of a device that isn't really market tested and doesn't yet have a lot to recommend it besides its newness. Christmas of 2016 will give it time for good games to have bubbled up and make the value proposition of the Oculus + supporting hardware much stronger.
Christmas is exactly the right time to roll out a product that has nothing to recommend it besides novelty and a ton of buzz. That's when all the people who know nothing about what they're buying go shopping!
Pushing it to Q4 2016 will give them time to develop some good apps for it, but it will also give competitors like Valve and Sony a year to build up an installed base. And "pay X hundred dollars for this novelty gizmo" is a much harder sell when the customers already paid X hundred dollars for a similar gizmo the year before.
It appeals to a very specific market at the moment. Most consumers will probably never experience VR on a PC at home.
The real seller to mass market will be a combination of integration with phone handsets (a la Gear VR) or console (Morpheus/PS4) devices. The mass market doesn't want a big, expensive black box chugging away in their living room.
PC gaming may be bigger than you realize - it's twice the size of the console gaming market and growing. PC might not reach everyone, but it's where the money is.
Not many, considering you really want a GTX 970 minimum to keep the framerate high enough, which is crucial to reduce motion sickness. Those new MSI GTX 970 4GB Nvidia cards look nice. I'd put the $400 down for one if I hadn't just bought a coffee machine.
BTW, these graphics cards are the first I've seen which spin down to a stop for normal computing; the fans only kick in for 3D applications. Great stuff. PC gaming and graphics are more amazing than ever.
The Star Citizen community is slavering over the Oculus and its competitors. I don't know what adoption rates will be but the population of the game's backers is approaching a million accounts, though some of those are surely alt "accounts."
I bring this up to say that I believe people are underestimating how much interest actually exists in acquiring such a device.
Does anyone know of a list of PCs that meet (or exceed) the recommended hardware specs for the Rift? Not that I'm going to run out and buy one of these machines in anticipation of the Rift's release next year, but I'm interested in knowing roughly what a premium VR experience costs today (excluding the headset, obviously)
Edit: To rephrase my question, can anyone point me to a list of PCs manufactured by companies like Dell, HP, Lenovo, etc. that meet or exceed these requirements?
Today you really need a Titan X or GTX 980 Ti for a premium experience across the board.
I don't think anyone can predict the future as there is currently so much optimisation going on across all the involved parties - Intel, Nvidia, Microsoft, Oculus, Valve, Epic, Unity, etc. They are all focussed on getting the best performance and experience out there.
This is a great summary of what Nvidia are currently doing:
Assuming Nvidia's VR SLI pans out, dual 970s should get you 80% of the perf of dual 980 Tis for a cost comparable to a single 980 Ti. But wait until VR SLI is demonstrated and tested in the wild; until then, the added latency of AFR SLI is counter-productive to VR.
> To rephrase my question, can anyone point me to a list of PCs manufactured by companies like Dell, HP, Lenovo, etc. that meet or exceed these requirements?
Buying a prebuild is a bad idea. Apart from being overpriced, they often have strange custom motherboards and cases which make upgrades complicated.
There's always those places where you select the parts and they assemble the machine for you, but I can't recommend them. I bought a machine from Cyberpower UK a few years ago and had nothing but problems with it.
Just grab the parts from Amazon and assemble them. It isn't difficult, you're simply buying it in 7 parts and plugging them together. That way you'll know it's been done correctly, all the parts are standard, and you didn't pay over the odds.
Anyway, your question was for a hard price. I would budget $900 USD for all the parts (including a 4690K CPU and GTX 970 graphics card). For a really high-end setup for the Rift (980 Ti), the next step up would be about $1200.
a) Want something that physically looks good enough to have in their living room - modern prebuilds are sleek and small while making a small-form-factor custom is super-hard
b) Have more money than time. Who has time to track parts compatibility and figure out the upgrade path for your device, only to drop half of the cost of a new unit on your upgrade? There's a reason Apple has been so successful while ignoring upgradeability: the hardware market has never made upgrading easy-enough to be in-reach of the majority of users. While physically installing the parts is simple, compatibility is always frustrating.
Well shoot, upgrading rarely works well even for power users unless you're upgrading very often instead of only when necessary. By the time your CPU can't keep up, Intel has a new socket and you need a new motherboard. By the time your video card needs replacing, you'll likely need a new PSU to handle the new power requirements. RAM changes slowly enough that you might get two or three upgrade cycles from your RAM, but you'll likely need more... and if you bought 4x1 GB and now you want 16GB, you're throwing it all away anyway. Hard drives are the only things that stay easy to replace, but hard drive tech doesn't move that fast. Upgrade to even a slow SSD and you're golden for quite a while.
I bought my computer five years ago. After two years, the only things original were the hard drive, the RAM, and the case. After four years, only the case was original (and even that's trashed now as pieces and parts have broken).
Mini-ITX is easy enough. Micro-ATX is easy as pie. There are very nice cases for both. If you know of a sub-mITX OEM build that has 4690K+980Ti level performance, I'd like to see it.
>compatibility is always frustrating
Not really. A graphics card, for example, has two things to check to verify compatibility:
* Will it physically fit in my case? (Check the length)
* Is my PSU powerful enough, and does it have the correct connectors?
That's it. Every graphics card has been PCI-E for a decade. If it fits and has power, any card will work in any motherboard.
CPUs are also not too obtuse. If you have an Intel 9-series motherboard, you can install any Haswell or Broadwell chip. This kind of stuff can easily be googled. It's no harder than getting the correct speakers for your home theatre.
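The two-item checklist above can be sketched as a tiny script. All the numbers here are hypothetical example specs, not a real parts database:

```python
# Minimal GPU-upgrade compatibility check -- a sketch of the checklist above.
# Every figure below is an illustrative assumption; substitute your own specs.

def gpu_fits(card_length_mm, case_clearance_mm,
             card_power_w, psu_watts,
             psu_pcie_connectors, card_pcie_connectors):
    """Return True if the card physically fits and the PSU can feed it."""
    if card_length_mm > case_clearance_mm:
        return False                       # too long for the case
    if card_power_w > psu_watts * 0.6:     # rough headroom rule of thumb
        return False                       # PSU can't feed it comfortably
    return psu_pcie_connectors >= card_pcie_connectors

# Example: a ~280 mm GTX 970-class card (two 6-pin plugs) in a mid-tower
# with ~310 mm of clearance and a 550 W PSU
print(gpu_fits(280, 310, 145, 550, 2, 2))   # True
```

The 60% headroom rule is just a conservative convention some builders use, not a hard spec; the point is only that the whole check reduces to length, power, and connectors.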
>Every graphics card has been PCI-E for a decade. If it fits and has power, any card will work in any motherboard.
You do want to pay attention to the mobo choice if the card itself requires PCI-E v3.0. There are still a lot of v2.0 mobos for sale. Most mobos have at least one 16x slot, which is the one you'll want to use. Things get a little more complicated with multi-GPU card builds.
>You do want to pay attention to the mobo choice if the card itself requires PCI-E v3.0.
Nope. PCI-E 3 capable cards work perfectly in PCI-E 2 motherboards. In fact, no graphics card currently on the market can significantly benefit from the boost offered by PCI-E 3. This includes Titan Xs in SLI: http://www.anandtech.com/show/7089/geforce-gtx-titan-twoway-...
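For reference, the per-direction bandwidth numbers behind this claim: PCI-E 2.0 runs at 5 GT/s per lane with 8b/10b encoding, while 3.0 runs at 8 GT/s with 128b/130b encoding. A quick back-of-the-envelope sketch:

```python
# Rough usable per-direction bandwidth of a x16 slot, from the spec'd
# transfer rates and line-encoding overheads.

def x16_bandwidth_gbs(gt_per_s, encoding_efficiency):
    """Usable GB/s for 16 lanes: rate x encoding efficiency x lanes / 8 bits."""
    return gt_per_s * encoding_efficiency * 16 / 8

pcie2 = x16_bandwidth_gbs(5.0, 8 / 10)      # 8b/10b encoding  -> 8.0 GB/s
pcie3 = x16_bandwidth_gbs(8.0, 128 / 130)   # 128b/130b encoding -> ~15.75 GB/s
print(round(pcie2, 2), round(pcie3, 2))
```

So a v3.0 x16 slot has roughly double the bandwidth of a v2.0 one, but as the Anandtech link shows, single consumer cards don't saturate even the v2.0 figure.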
>Things get a little more complicated with multi-GPU card builds.
Sure, but I would speculate that anyone going that route is an enthusiast who knows what they're getting into and is happy to do the research.
>Nope. PCI-E 3 capable cards work perfectly in PCI-E 2 motherboards. In fact, no graphics card currently on the market can significantly benefit from the boost offered by PCI-E 3
Ah, OK. When shopping for a GPU compute server, I was paying attention to PCI-E 3.0 vs. 2.0 on the server motherboards.
I had assumed that it was a similar situation with consumer graphics cards these days, given how long v3.0 has been out.
Specifically, he got two GTX 970s to run in SLI, but the motherboard didn't end up supporting SLI. Neither he nor the PC shop that built the machine for him picked up on that.
It didn't help that the motherboard was branded as a 'Pro Gamer' motherboard (Asus H97-Pro Gamer: https://www.asus.com/au/Motherboards/H97PRO_GAMER/). How can something be 'pro gamer' without SLI support?!?!
Fails all around, but yeah you definitely need to be careful about the motherboard you pick.
I had my last gaming rig built at a small PC store. Not a big chain or online or anything, just one where I could walk in, sit down and talk to the guy about general components, pick all the pieces I wanted and then have them order them in and build them.
Build quality was great, he did a really good job with the cabling, I got all the boxes/manuals/extra parts. Have no complaints.
I've done some upgrades to that machine myself over time (RAM, graphics card, PSU, a drive change). PC assembly has gotten incredibly easy compared to the past. It's the little touches. I notice that tolerances on cases are a lot better these days (things tend to align much more easily than they used to). PCI-E slots are a lot easier to put cards into than the older ones. RAM sticks generally have heat-spreaders on them, so you don't have to worry as much about handling them as you used to. Drive cables are easier (cabling in general is easier, as most things are keyed these days), thumbscrews are almost ubiquitous, as are easily removable drive bays.
The only thing I still get nervous about is CPUs and their heatsinks. I haven't fitted a CPU in a long time and a mate of mine did my heatsink replacement for me.
All that said, I'd still pay someone money to assemble my PC from scratch. Someone good that assembles PCs for a living is going to do it way better and way faster than I would. At least, I'd hope so :)
I haven't built a computer since, roughly, 2006. It's just not very interesting to me. My computers today are an iMac and a MacBook Air, both of which work wonderfully. I normally play games on a PS4 or PS Vita. Again, they work seamlessly. I don't want to mess around with finding the optimal power supply. I'm too old for that.
If you did want to build a gaming computer again, probably the easiest thing is to find a build guide on one of the popular PC sites like Tom's Hardware. They often come out with several different builds in different price ranges. Theoretically, if you just order the same components, it should work with little trouble.
Early 30s, no kids, good amount of disposable income, and a keen interest in VR. How am I not in the target market for an ultra-high-end PC gaming peripheral? Because I don't want to spend hours figuring out what today's best set of gaming PC components are? That's silly. I am the personification of the "Shut up and take my money" meme with regard to buying a Rift.
Well, for a start, by the list you gave you don't own a PC per se (although Apple is just a rebranded, overpriced PC with a proprietary OS, from my humble point of view). Or do you consider the PS4 a PC? Your post doesn't make much sense to me...
I normally build. But recently I was helping someone purchase a new desktop. I noticed that it is extremely difficult to match the price of some pre-built systems. What he ended up getting came equipped with a Gigabyte motherboard as well as a load of standard parts that I costed out to be more than the purchase price.
Neither Dell (Alienware) nor Lenovo sell reasonable pre-built hardware configurations.
Companies like http://www.ibuypower.com/ let you start with a base system and configure it until it meets the requirements. The recommended configuration costs about $1300, a premium one would be $1600.
I would imagine you'd get the best results from an Alienware style sub-brand of the listed companies; very few stock PCs come with the high-end GPU's you'd want to make the rift a good experience.
All in, a headset plus a machine to drive it will probably run you ~ $1,500 at launch, though the cost of the PC will come down over time.
Interestingly, Oculus has noted that the target spec. for the PC "will not change for the lifetime of the product." In other words, they seem to realize that this is expensive now, but that it's better to deliver a really high quality experience that gets cheaper over time than to launch with a sub-optimal experience that eventually fires on all cylinders.
This is likely going to be the biggest turn-off. People don't have those big PCs any more. Most people just have laptops or MacBooks. I guess nobody wants to bring those beasts back into their living rooms just to use the Oculus Rift. They should have really considered a full-device experience where you get everything you need and the hardware is designed exactly for an optimal Oculus experience. The world has long moved on from needing to know which graphics card your PC has. I really hope this has good enough performance on a typical Lenovo/MBP at least.
> I guess nobody would want to bring back those beasts in their living rooms again just to use Oculus Rift.
If they like to play PC games in their living room, they already have a PC there. If they don't, this won't change that.
>"They should have really considered full device experience where you get everything you need and hardware is exactly designed to give optimal Oculus experience."
So... sell the Oculus together with a gaming PC? What's the difference between that and getting your own separately?
>World has long moved on from knowing which graphic card you need for your PC.
Except for those people who play video games on their PC. Those people generally do know what video card they have.
>I really hope this has good enough performance on typical Lenovo/MBP at least.
It will not. A powerful desktop gaming computer is absolutely required for acceptable VR performance in any interesting game.
The entire PC gaming industry, which is likely the initial target market, would beg to differ. Of course that will change. Also, I wouldn't exactly consider my desktop a beast. Form factors are pretty small these days.
Sure, this will be a hot product among PC gamers but they are not the mainstream consumers. I wanted to buy the device for my mom and have her use it. I think the vision should be to bring VR mainstream that is usable by all without technological friction.
Yeah, I am sure that is the goal. But in order to provide a clean, seamless experience, you need to render roughly 3x the amount of data compared to a normal 1080p monitor. You also want to make sure you don't drop frames, as that would have a huge impact on the experience. You need high framerates and responsive feedback so that motion sickness is minimized. As you can see, there are a lot of reasons why a high-end graphics card is required. So yes, that is the goal, but we can't get there right now.
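The "roughly 3x" figure is easy to sanity-check. Using the widely reported consumer Rift panel specs (2160x1200 across both eyes at 90 Hz) and an assumed ~1.4x per-axis render-target scale for lens-distortion supersampling (a typical value, not an official number):

```python
# Back-of-the-envelope pixel throughput: consumer Rift vs. a 1080p60 monitor.
# Panel figures are the commonly reported consumer Rift specs; the 1.4x
# per-axis supersampling factor is an assumption for illustration.

rift_px_per_sec = 2160 * 1200 * 90            # both eyes combined, 90 Hz
monitor_px_per_sec = 1920 * 1080 * 60         # standard 1080p at 60 Hz

print(rift_px_per_sec / monitor_px_per_sec)                 # ~1.9x at native res
print(rift_px_per_sec * 1.4**2 / monitor_px_per_sec)        # ~3.7x with supersampling
```

Even at native panel resolution the headset needs nearly double the pixel throughput of a 1080p60 monitor, and with the supersampled render target it lands in the ~3x-and-up range the comment describes.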
The mainstream experience will be on high-end smartphones, not PCs. When Carmack and Facebook talk about getting a billion users into VR, this is how they plan to do it.
Something like this may become more mainstream [1]. Use your laptop on the go, and plug it into a high-power graphics card at your desk (along with monitor, keyboard, etc., and now Rift!)
It's strange, but when I think about it, DK1 was a much more positive 'product-y' experience for me than DK2 was. Perhaps part of that is that DK1 was my first VR headset, so much of the initial awe factor had worn off by the time I got the DK2.
The DK2 was/is plagued with software teething issues. Totally acceptable as it's a developer kit, but still frustrating and a little worrying. The other thing I found disappointing coming from DK1 to DK2 was the vastly decreased field-of-view. I think it's a combination of things that contribute/contributed to that (over-zealous vignetting in the earlier SDKs being one of them), some of which may have been fixed since in software. I say this as, to me, the Gear VR FOV seems wider than DK2's despite having a slightly lower quoted value.
My DK2 is basically a brick now. There's been a compatibility issue with AMD cards running it in extended mode, which goes all the way back to September of last year which has not been fixed on my system by any driver updates or Oculus SDK updates since. Given that the cool stuff I'd like to use with it (specifically VR Desktop and VorpX) doesn't support direct mode and only does extended, I've almost given up on it (this affects dev too - I believe the latest Unity editor still doesn't support running stuff in Direct-to-Rift mode from the editor, for example).
That's an acknowledged AMD bug which is fixed but I don't believe it's available yet. Oculus v0.6 apps can always just run in direct mode and avoid that problem.
They could've at least gone with Discourse. But I guess owing to the abundance of PHP talent at hand, they might've chosen to stay with a widely used one.
I really like Discourse. But having looked into it quite seriously, it has a lot of gotchas, drawbacks, and limitations. The interface is very nice, but the feature set isn't as "mature" as other, older forum software. It seems like if you're using it exactly the way they use it on the official demo/Discourse site, then you're good; if you need to use it in any other way, then tough luck.
PHPBB is fine if you skin it well. Plus PHPBB's system requirements are extremely easy to meet (PHP and MySQL). Discourse requires a more niche tech stack and a much more powerful machine to run. Good luck running it on a $10/month VM.
Many customers run Discourse sites on the $10/month Digital Ocean instance just fine. It is true that you can't go below that, though, as 1 GB RAM is the minimum.
I'm curious to see how they intend to solve the glasses problem. Ideally, they'd partner with someone like Zenni Optical to provide $10 drop-in lenses for any prescription.
Just like monitor manufacturers don't offer prescription glasses along with the purchase, Oculus won't offer lenses.
Corrective eye-wear is very individual and requires some amount of expertise (exam, specialist), so it doesn't make sense, and isn't very feasible, for Oculus to provide that for you. Also, contacts are very cheap.
Monitor manufacturers don't make a device that straps onto your head. Apparently wide frames are an issue[1]. When I was playing airsoft, I had ESS goggles. ESS also sold prescription lenses[1] designed to slot into the air holes in the goggles, all you had to do was send them your prescription.
Contacts may be very cheap, but even with dailies, I wouldn't want to put in contacts just to game. I prefer glasses in general.
I'm a bit of a sound nerd, but the on-ear headphones are really underwhelming. No noise isolation, most likely low quality drivers... this would make for a pretty terrible VR experience.
Simply upgrading them to closed-back, over-the-ear headphones would make a big difference - you'd be able to use lower volumes (at which point even cheap drivers aren't too bad), and you'd be more isolated from the environment, similar to how the rest of the headset works with your vision.
The earphones are detachable, and you can use your own headphones. As I understand it, the quality of these is actually pretty good, and definitely better than the "gamer" headphones the target audience might have. The reason they are on there is that Oculus wants to be sure everyone who has the Rift has a decent audio solution, because positional audio helps so much with tricking your brain into being immersed.
The Crescent Bay prototype used the drivers from Porta Pros, which are pretty good. The point of the included headphones:
A) Based on Twitch and YouTube a lot of people didn't use headphones, losing most benefits of headtracked and spatialized sound. Integrating them makes them much more likely to be used.
B) Standardized frequency response and built-in DAC/amp means their HRTF (head related transfer function) spatialization tech can be tuned precisely, without worrying about the EQ of thousands of different variations. Though people are still free to use their own and lose the fine tuning.
Closed back headphones don't offer as much "soundstage" as open-back headphones. With open-back headphones things really sound like they're coming from outside of your head, which is critical for the 3d sound illusion.
So the way to signal that one is an evidence-based aficionado of sound is to call yourself a "sound nerd"? In today's parlance, "audiophiles" are all deluded woo-woo proponents deserving of ridicule. (I find that to be a form of shallow prejudice: one can simply evaluate the science behind what someone is saying.)
It's telling that people's reactions to essentially the same comment would vary vastly with the form of self-identification. I suspect people are downvoting my comment just because it contains the word "audiophile." The point is that the topic is subject to a lot of unthinking prejudice in forums like this.
Since Oculus sold to Facebook I dropped them like a hot potato, and am instead embracing the HTC Vive, Valve's main VR platform for the SteamBox. That being said, where I think VR will really take off is in a combination of VR and human brain interfaces (including gaze tracking), to get past some of the limitations of human interface design we have been stuck on for so long (mouse/keyboard/joystick).
-Tracking of the controller: there are LEDs on it that emit IR light so the tracker can accurately track the controller. This accuracy is not achievable at the moment with just internal sensors (although the controller has these, presumably for even better accuracy).
-The inside of the ring has sensors to track your finger gestures. I can imagine this would not be possible with just the internals of the controller.
-The ring acts as a sort of extra handle. Imagine opening your hand totally (as in waving). This is a gesture you would maybe like to do in VR. The ring will hang between your thumb and index finger while you do this. Other more traditional tracked controllers would fall out of your hand.
Looks like they may have IR LEDs around the outside that are used by the camera for positional tracking. They might also be used to track finger movements on the inside of the ring, but I'm not sure there.
I am still not sold on VR, but I feel that AR (Augmented Reality) really is the future.
1) AR seems to help in real-world situations, whereas VR actually makes you feel vulnerable due to lack of vision and impaired hearing of what's around you.
2) VR also has physical issues: nausea, plus the ~10% of people with lazy eyes or other optical problems that make them unable to see 3D clearly.
3) Star Trek Holosuite is AR and that is my dream one day.
4) If Wii motion controls couldn't sustain their popularity, how will adding a helmet make that physical exertion more appealing?
But AR requires several technological leaps beyond VR in terms of transparent screens and location tracking. So I'd expect usable VR headsets to appear in homes a generation or two before AR ones.
I mean, the HoloLens is additive. That means if you want to show a dark object in a room with white walls, you can't.
Most of the journalists w/ first person demos have given relatively poor descriptions, but Oliver Kreylos a UC Davis VR researcher recently published a very in-depth review of his experience w/ the Hololens hardware, including his observations on FOV, occlusion, etc: http://doc-ok.org/?p=1223
HoloLens hands-on demos show it to be a ready product. It doesn't require any breakthroughs.
One reviewer was shocked at how opaque the holograms were. I think the situation you're describing with a black object has already been solved:
>What did surprise me, though, was how little opacity there was to the holograms. I expected them to look somewhat transparent, with the background shining through. That really wasn’t the case, though. They looked really solid and there was almost no transparency.
There is a lot of money in the AR space right now, and the different technologies for AR right now take several different approaches.
1) CastAR (eyeglasses that reflect light back to the viewer) looks really promising: http://castar.com/
I could seriously see this taking over monitors for some people's computing, especially if there are security or privacy issues.
2) Magic Leap (sounds like science fiction; I have little faith in their promise right now) beams light into your eye in a way that won't make you sick or give you a headache. There's a serious technology problem with silicon photonics production that has stumped Intel right now: http://www.technologyreview.com/news/538146/magic-leap-needs...
3) Microsoft Future Vision 2011 - https://www.youtube.com/watch?v=a6cNdhOKwi0 Uses glasses to overlay a translation on signs (Traditional idea of glasses with transparent display) I find this to be a decade away easily.
Transparent screens are definitely the way forward, but don't discount the possibilities of camera-to-screen passthrough within Gen 1 VR devices. Ultimately the distinction between AR and VR is going to dissolve to the point where it's just different ends of a spectrum -- add enough objects and backgrounds to the real world, and you get a virtual world.
I wonder why we need transparent screens though. I see AR as a subset of VR, not a superset of it. Once you've solved the field-of-view, screen resolution and (I'd also argue) dynamic range/colour accuracy of VR, you add a front-facing, wide-FOV stereo camera to it and there you have it, AR with the dark object problem already solved for you.
AR and VR serve different goals. In this page they mention presence, which is basically the internal lizard-brain feeling of being somewhere else. Presence is what makes VR something unique, different, and powerful. AR doesn't provide presence.
I think AR is infinitely more practical for everyday tasks but ultimately less useful as an entertainment medium. And the holosuite is just as much VR as it is AR.
I agree with your view that AR is more the long-term future of all this tech. However, I would disagree that VR isn't going to be a big thing in the short to medium term.
It's a completely new medium to a large extent that is still finding its feet. It offers a completely new* experience and developers/artists/storytellers have yet to really tap into what it can do and deliver in terms of feeling and emotion. Everyone is still trying to make old methods suit the new format - you could see this just from the fact that Oculus demo'd a 3rd person game. Once teams start to fully embrace the medium and the hardware has caught up, I honestly believe it will deliver some very innovative and emotional experiences that will lift people out of their real life environment. Its uses will be much more widespread than just gaming - training, relaxation, education, social interaction, etc.
* When I say new I know it's been knocking around for decades, but the tech has only just reached a point where it is somewhat viable.
I'm with you on your second statement, not on the first.
VR will be clearly more than just movies. And AR is far from just for smartphones. Granted HoloLens isn't ready and might not be that great, but it has so much more potential than just smartphone gimmicks. A good set of AR can completely replace my work setup, I wouldn't have the need for monitors anymore. I'm excited by that idea, a new way to interact with the world.
I don't think you understood his comment so here goes. VR is like going to the movies, where you are focused on a single thing and everything else isn't important. The people beside and behind you are dark and you don't focus on them, you focus on the movie. AR is like using a smart phone, you use it to talk to people, find restaurants and reviews, direct you via a map, tell you about locations around you, so many things can be done with smartphones. AR is that, using the world around you to give you more information.
> AR seems to help in real world situations vs VR which actually makes you feel vulnerable due to lack of vision and impaired hearing of what is around you.
AR is also better suited for games and simulations that have more than a handful of controls. Flight simulation, for example, is next to impossible with an Oculus because it blocks your view of the controllers.
Flying around is fine when you have hands on the stick (or yoke) and throttle. But when you need to operate the landing gear, flaps, lights, radios or navigational instruments, etc you need to find the right button on your controllers or keyboard. That doesn't work with the Oculus.
I'm looking forward to trying out the CastAR in the future. It looks like a very nice solution for home cockpit builders.
I agree with your assessment. AR is more ubiquitous; VR is more like Second Life 2.0. Some of the engineering and design outcomes are noteworthy, but that's about it.
Second Life 2.0 doesn't sound very exciting to me. I really see VR as being a short hit and quit activity. Touring some place in VR in the classroom seems great but more than 10 minutes and most will quit whatever the activity is.
I'm still willing to give it a try, but from the outside looking in, I expect it to have the same long-term success story as Second Life and the Wii.
I feel VR will be strictly a gaming toy and even then will be something of an eccentric item considering the cost/hassle of it all, not to mention all the side-effects. AR seems promising, and this hands on review has me really interested in Hololens.
It's crazy that we have a wearable hologram headset that produces opaque real-world holograms, but no one is really talking about it. Yet an ugly black VR box mounted on your face, projecting a cell phone into your eyes, somehow gets all the press. If anything, Oculus shows us how gamer-centric the tech press is.
Hololens is pretty much a wearable holodeck. The money I had earmarked for the Vive or Oculus is going to sit in my savings account until the Hololens comes out. I doubt I'm the only one sitting this first round out. AR might just strangle VR in the cradle - and that's a good thing. Let's face it, Oculus didn't firesale itself to the evils of Facebook because it had a bright future. It did so out of desperation.
I'm really excited about VR, have a DK2 and am really impressed with the experiences available already. But, I've found it very hard to use for long periods of time (more than 20-30 minutes) as my eyes eventually focus on the screen door and I can't seem to break that without taking some time away from it.
That was my immediate thought as well. Their choice of a scruffy looking guy (nothing wrong with it in and of itself) says a lot about their intended audience.
Hard to imagine Apple or Google marketing a product with that kind of imaging.
I don't have any problems with you being scruffy looking :), I'm just saying a lot of people do have a reaction to it, which is why most companies go out of their way to use clean cut people.
Essentially it just flashes the frame briefly, then switches the display to black for the rest of the refresh interval. This reduces the smearing caused by the tiny motions your head and eyes make between refreshes, which would otherwise make your brain respond badly because your movement isn't matching what you are seeing.
They also implemented something called Time Warping, which is complicated under the hood but is basically a reprojection of the last rendered frame using the latest head orientation, letting the device display more frames than are actually rendered. This again helps keep your brain from freaking out about the incongruity between your head motion and your vision.
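In its simplest (rotation-only) form, timewarp amounts to applying a homography to the already-rendered frame: H = K · R_delta · K⁻¹, where R_delta is how far the head has rotated since the frame was rendered. A minimal sketch in Python/NumPy, with made-up camera intrinsics purely for illustration (not Oculus's actual numbers or API):

```python
import numpy as np

def rotation_y(deg):
    """Rotation about the vertical axis (yaw), in degrees."""
    t = np.radians(deg)
    c, s = np.cos(t), np.sin(t)
    return np.array([[ c, 0.0,  s],
                     [0.0, 1.0, 0.0],
                     [-s, 0.0,  c]])

def timewarp_homography(K, R_render, R_display):
    """Homography reprojecting a frame rendered at head pose R_render
    to the newer head pose R_display (rotation-only timewarp)."""
    R_delta = R_display @ R_render.T
    return K @ R_delta @ np.linalg.inv(K)

# Hypothetical pinhole intrinsics: focal length 600 px,
# principal point at the image centre (540, 600).
K = np.array([[600.0,   0.0, 540.0],
              [  0.0, 600.0, 600.0],
              [  0.0,   0.0,   1.0]])

# Head turned 1 degree of yaw between render time and scan-out.
H = timewarp_homography(K, rotation_y(0.0), rotation_y(1.0))

# Warping the image centre shifts it horizontally by
# f * tan(1 degree), roughly 10.5 pixels.
p = H @ np.array([540.0, 600.0, 1.0])
p = p / p[2]
```

This only compensates for rotation; compensating for head translation as well needs per-pixel depth, which is why the full positional version is much harder.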
I used the rift with the full head tracking this year at PAX East and felt no motion sickness at any point. A lot of the issues in 2013 were fixed when they started tracking full head movement instead of just rotation.
Of course I don't think I'm prone to motion sickness and the demo lasted only about 20 minutes max but I believe that I could wear it for over an hour or more with zero adverse effects.
> I used the rift with the full head tracking this year at PAX East and felt no motion sickness at any point.
You're one of the lucky ones. I've tried all the Oculus devkits as well as some other VR solutions and I get terrible motion sickness within minutes and it lasts for hours after I'm done with the game. This is a very personal thing but anecdotal evidence suggests that this is a major issue for a significant portion of people.
Oculus DK2 was a major improvement, though.
The Oculus is a very personal device, you'll have to adjust the lenses and the headgear to suit your eyes, and then it might be off for someone else.
This is a real issue they're going to have to solve, so far they've been excused because their products have been called "development kits". But if they start selling a final product that will make half or a quarter of people sick, they'll be getting bad reviews and reputation.
I'd really like to be able to use a VR solution, it's a competitive advantage in the kind of games I play (racing and flight simulation), but nothing I've tried so far works for me.
I heard Jaron Lanier speak about this 30 years ago at Xerox PARC, and the general principles of VR have changed little since then. Computers are at least a million times faster now, so you don't puke because the computer is a half second behind when you move your head.
1. Oculus is a display, therefore it should work with almost anything (i.e. modern devices that put out HDMI, etc). A 4K monitor doesn't require me to buy a whole new computer to use it. I don't understand why Oculus (or any other VR display) needs to be proprietary to the point that I may end up owning one device (Oculus) for PC and another (Morpheus) for PS4 gaming. This makes as much sense as companies launching nearly identical video formats (BluRay vs HD-DVD most recently). Again, this is a display not unlike a 4K display and should work similarly universally, I would think/hope.
2. It seems too big to me to be truly immersive. To lose yourself you have to not notice you are wearing it, which at its current size seems unlikely.
3. It needs to be wireless, any cables will detract from the immersive experience. I'm sure they are working on this.
4. The controllers they created look pretty interesting but I don't see how they will be conducive to touching, picking up, grasping items which would seem to be a more important thing than re-inventing FPS controllers if VR is truly going to take off this time.
5. I worry that too much focus is being placed on adapting FPS to this (and other) VR tech. While it would seem a natural fit, and probably will prove to be the easiest style of game to adapt, it's also lazy. It will be far more challenging (and I think rewarding) to adapt VR for other purposes such as viewing live events (concerts, sports, etc), driving games/sims, sports games/sims and action adventure games such as Witcher and Dark Souls, just to name a few off the top of my head. Maybe the new controllers will work great with those types of games/activities, but my gut says they are really optimized for FPS, which is a shame.
It's exciting times for VR but I sure hope it doesn't get overtaken by FPS games because there are so many more potentially awesome applications out there to also focus on.
To respond to each of your points with my own point of view:
1. It's not a display, it's a device that contains a display: it also has motion and tracking sensors. What the Rift wants to achieve is really low latency from input (moving your head) to photon. This problem is hard, and to solve it you need to create your own solutions first. In an ideal world everyone would work together to make the best "Rift" that works on all devices, but that is not how capitalism works. This doesn't mean we will never see it. Some companies will win the VR battle, and they will probably be the ones to provide the best HMD for multiple devices. Hopefully we will get to a world with an HMD standard, but that is just not how new technologies evolve.
2. The Rift is lightweight, and people who have tried the Rift don't even notice the HMD on their head after a couple of minutes.
3. The technology isn't there yet (or too new/not mature) to do it with as low latency as with a cable, but like you said it will be eventually.
4. Agreed. Although they have a grasping button, I'm not sure it will mimic grasping convincingly enough. Playtesting with the actual device will tell us more, I think.
5. I think we will see a lot of games that are not FPS oriented to be honest. I didn't see any FPS announcements (except for the dogfighting in space game but that's not really FPS to me). I'm personally not too worried about this.
@evo-9, All of your observations and concerns suggest a really superficial understanding of VR in general and Oculus in particular.
Do a Google search for recent lectures by John Carmack. If you have any serious interest in the field, you won't find a more rigorous or illuminating introduction.
When I finally tried the OR DK2 last month it didn't meet my expectations. Latency was not an issue, but the field of view was simply too small to be convincing. It felt like looking through a toilet paper roll sawed in half.
I'm just waiting for some internal fallout in the company between execs so I can make a well-timed "rift" joke... It's interesting to see these images now in comparison to the leaked ones that came out.
I've used a couple phone controllers, and the problem always comes back to no game really supports them because not everyone has one. The best games for a controller are the Modern Conflict games, and even then they have quicktime events that require me to drop the controller and touch the screen.
At the very least 8-bit era emulators will be able to capitalize on them well since those games just require 2 buttons and a D-pad, which a single hand of this controller supplies.
Somebody shoot the graphic designer for laziness. If you are making promotional artwork for a VR headset, try not to have the view in each eyepiece being the same picture slid horizontally. Surely it wouldn't hurt to do two renders.
Years in the making and the final industrial design of the thing is just a slight step up from a cardboard box. Awful foam around the headset, crummy plastic on the headbrace, flimsy open headphones, and it looks like it will still be unusable for anyone that wears glasses. I'm still grateful to Oculus for kicking off all of the interest in VR, but I doubt I'll ever be buying their devices if this is the best they can do.
Be aware, this man is someone who has already bought an OR-type system, and is likely to buy another. I'm not saying this to demean him, but to add color to a world where ORs are commonplace. If the OR is going to be common, then this picture will be a common sight as well. People complain about friends always texting and looking at their phones; if the OR becomes widely used, this picture will be the new complaint.