I have to say, it doesn’t get much more exciting in this field than what’s occurring from an equipment viewpoint over the next year. I’ve talked about the touchless Leap Motion interface already (it’s due to be released pretty soon), and then there’s the Oculus Rift as arguably the first high-quality consumer VR headset. Both of these are likely to form part of my PhD research, but there’s a third piece of the puzzle that, while unlikely to be used in my research, will fulfil a long-term desire in regards to gaming.
It’s called the Omni, and it looks like it’s going to bring an affordable whole-body movement option to gaming. I’ve always loved the idea of being able to get fit from gaming, and the Omni may just pull that off if it achieves what it’s planning to.
Have a look for yourself and note that the headset is the Oculus Rift, not a component of the Omni:
John Carmack created this, and set me on the road to gaming obsession
John Carmack is a bit of an icon in gaming circles, and he’s also one of the people supporting the Oculus VR consumer headset that’s on the near horizon. I’d very stupidly assumed (having not read any biographical details on him until today) that he wasn’t that deep into the coding and science of things like this.
He’s just posted a nice piece of work on the challenges of latency in virtual reality. If you’re from a computer science background you’ll get a lot more out of it than I did, and even I could appreciate just how critical latency is in this sphere.
Latency is of course an important consideration anywhere but Carmack shows just how far we probably have to go to make VR headsets that give an accurate perception of real-time movement in physical space. It’ll happen of course – and I still want an Oculus now.
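To get a feel for why latency is so hard to tame, it helps to think of it as a budget spread across the whole motion-to-photon pipeline. The sketch below is a toy illustration: the stage names and millisecond figures are my own round-number assumptions, not values taken from Carmack’s article, and the ~20 ms total is simply the target figure often cited as a threshold for convincing VR.

```python
# Toy motion-to-photon latency budget for a VR headset.
# All per-stage figures below are hypothetical round numbers chosen
# for illustration; real systems vary widely.

PIPELINE_MS = {
    "sensor sampling": 1.0,
    "sensor fusion / head tracking": 2.0,
    "game simulation": 8.0,   # roughly one frame of CPU work at ~120 Hz
    "rendering": 8.0,         # roughly one frame of GPU work
    "display scan-out": 8.0,  # panel refresh and pixel switching
}

TARGET_MS = 20.0  # commonly cited motion-to-photon target for VR


def total_latency(stages):
    """Sum per-stage delays into a total motion-to-photon latency (ms)."""
    return sum(stages.values())


if __name__ == "__main__":
    total = total_latency(PIPELINE_MS)
    print(f"motion-to-photon: {total:.1f} ms (target {TARGET_MS} ms)")
    for stage, ms in PIPELINE_MS.items():
        print(f"  {stage}: {ms} ms")
    if total > TARGET_MS:
        print("over budget: tricks such as predicting head pose or "
              "re-warping the rendered frame just before scan-out "
              "would be needed to close the gap")
```

Even with generous assumptions the simple sum blows past the target, which is exactly why the techniques Carmack discusses work by overlapping or short-circuiting pipeline stages rather than just making each one faster.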
An amazing piece of development by a public/private consortium. It’s an obvious evolution from tablet computing, and although it clearly has a long way to go, the implications for virtual worlds-based education are obvious, particularly in the health field, where hardy devices that can easily be cleaned are needed.
In 5 to 10 years I can see options like this achieving exactly that. If the price point ends up being reasonable, it’s also a great option at the pre-registration training level. Taking a purely virtual worlds perspective: if I can load up a good immersive simulation on something like the PaperTab, hand it out to participants at the beginning of a session, walk them through the simulation and allow them to walk out with it, I have to be making some inroads. That’s without the collaboration and interaction options already being shown between each PaperTab. Of course, it’ll depend on how successfully the developers can move beyond the gray-scale version on offer now, but I’d imagine that’s a given over a period of years.
Have a look for yourself:
The full press release:
Cambridge, UK and Kingston, Canada – January 7, 2013 — Watch out tablet lovers — A flexible paper computer developed at Queen’s University in collaboration with Plastic Logic and Intel Labs will revolutionize the way people work with tablets and computers. The PaperTab tablet looks and feels just like a sheet of paper. However, it is fully interactive with a flexible, high-resolution 10.7″ plastic display developed by Plastic Logic, a flexible touchscreen, and powered by the second generation Intel® Core i5 processor. Instead of using several apps or windows on a single display, users have ten or more interactive displays or “papertabs”: one per app in use.
Ryan Brotman, research scientist at Intel, elaborates: “We are actively exploring disruptive user experiences. The ‘PaperTab’ project, developed by the Human Media Lab at Queen’s University and Plastic Logic, demonstrates innovative interactions powered by Intel Core processors that could potentially delight tablet users in the future.”
“Using several PaperTabs makes it much easier to work with multiple documents,” says Roel Vertegaal, director of Queen’s University’s Human Media Lab. “Within five to ten years, most computers, from ultra-notebooks to tablets, will look and feel just like these sheets of printed color paper.”
“Plastic Logic’s flexible plastic displays are completely transformational in terms of product interaction. They allow a natural human interaction with electronic paper, being lighter, thinner and more robust compared with today’s standard glass-based displays. This is just one example of the innovative revolutionary design approaches enabled by flexible displays,” explains Indro Mukerjee, CEO of Plastic Logic.
I spent some time today reading some fascinating (offline) information about the 1918-1919 influenza pandemic and later on stumbled across an interesting article by Robert Bloomfield on the recent H1N1 (Swine Flu) outbreak. The take home message:
If you look at it this way, epidemics provide something of a ‘perfect storm’ for virtual worlds. They generate a strong demand for high-engagement distance collaboration (especially education for children and telecommuting for parents), and cause little damage to our infrastructure. So they are the right place to start.
It’s hard to argue with, although in a high-stress pandemic situation it’s easy to see how people would naturally revert to more hardy technologies like the phone. Time is likely to change that perspective though.
The Economist has an interesting article on the use of Second Life in training people around virtual consent. It’s an area where a virtual world has strengths (the ability to recreate a real-world situation) without some of the drawbacks (the lack of clinical complexity imposed by technological limitations).
Another aspect that has struck me over the past year is just how strong the UK is in virtual worlds and health. Imperial College London (where this training exercise is being hosted) is one of the shining lights worldwide, with a number of other UK institutions undertaking research in the area.
Emergency preparedness training usually requires closing down facilities and calling people off of work—but conducting emergency drills virtually is less disruptive and more convenient, says Colleen Monahan, the study’s principal investigator. Nearly all public health departments in the U.S. train workers for emergencies through traditional tabletop exercises and simulated real world drills.
Nothing beats the real thing but in the disaster preparation space that’s not usually an option so the virtual world route can be a very useful adjunct.
Dubbed ‘Razorback Hospital’, its purpose isn’t clinical in nature; it’s more a prototype of how technologies like motion sensors and RFID can assist the functioning of a hospital. It’s all about health logistics.
The goals of the project are:
1. Explore the future of Healthcare Information Technology (and other applications)
2. Explore ubiquitous computing – Internet of Things – Everything is Alive – the Matrix
3. Explore modeling and simulation
4. Explore smart objects
5. Explore how to represent business processes (workflow, plans)
6. Explore software architectures – system of systems
7. Explore connecting a virtual world to the real world (cross world correspondences)
8. Explore new ways to teach and collaborate