Link Between Violence and Games Refuted

The evidence has been there in dribs and drabs for years, but a recently published study puts the whole issue to bed in a commanding way:

This article presents 2 studies of the association of media violence rates with societal violence rates. In the first study, movie violence and homicide rates are examined across the 20th century and into the 21st (1920–2005). Throughout the mid-20th century small-to-moderate correlational relationships can be observed between movie violence and homicide rates in the United States. This trend reversed in the early and latter 20th century, with movie violence rates inversely related to homicide rates. In the second study, videogame violence consumption is examined against youth violence rates in the previous 2 decades. Videogame consumption is associated with a decline in youth violence rates. Results suggest that societal consumption of media violence is not predictive of increased societal violence rates.
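The core statistic behind findings like these is a correlation between yearly media-consumption and violence rates. As a toy sketch only – the data below are invented to illustrate an inverse relationship, and are not the study's figures:

```python
# Illustrative Pearson correlation between a media-violence consumption
# index and a societal violence rate, computed per year.
# NOTE: both series are made-up numbers for demonstration purposes.

def pearson(xs, ys):
    """Pearson's r for two equal-length numeric sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

game_violence = [10, 20, 35, 50, 70, 90]   # hypothetical consumption index
youth_violence = [80, 72, 60, 55, 41, 30]  # hypothetical rate per 100,000

r = pearson(game_violence, youth_violence)
print(f"r = {r:.2f}")  # strongly negative: consumption up, violence down
```

A strongly negative r, as in this contrived example, is the shape of the second study's result: videogame consumption rising while youth violence falls.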

So there you have it. Those who have worked in the field aren’t likely to be surprised, but the general public will have their preconceptions pretty heavily challenged by this. What’s your take?

Virtual Worlds and Learning Presentation

With thanks to Sarah Jones via the SL Health group – this should be a great session. It looks like the time zones may actually allow me to attend this one, which is great:

The University of Texas at Arlington College of Nursing will present a program with Dr. Dee McGonigle (SL: Houstonccn) speaking on the topic “Use of the Virtual Environment for Learning”.

Date: Wednesday, September 3, 2014

Time: 2:00-3:15 pm SLT

Location: The UT Arlington Conference Center in Second Life, UTArlington III, SLURL:

More information:

Dr. Dee McGonigle, PhD, RN, CNE, FAAN, ANEF, is a Professor and Chair of the Virtual Learning Environments (VLEs) at Chamberlain College of Nursing. She is a Fellow in the American Academy of Nursing and the NLN Academy of Nursing Education.

She co-founded the Online Journal of Nursing Informatics (OJNI) and was Editor in Chief from 1996 through 2013. She is an active researcher and, through her grant writing, has received over $870,000 in funding. She presents nationally and internationally, and has co-authored two textbooks: Nursing Informatics and the Foundation of Knowledge (one of Jones & Bartlett’s best sellers and AJN’s 2010 Technology Book of the Year) and Integrating Technology in Nursing Education: Tools for the Knowledge Era.

Dr. McGonigle has written more than 100 publications, including workbooks, book chapters, and articles. She is a member of both the Informatics and Technology Expert Panel for the American Academy of Nursing and the Serious Gaming and Virtual Environments special interest group for the Society for Simulation in Healthcare. She is searching for a way to facilitate translation by helping those who know (researchers) and those who do (clinicians) communicate and share. Her current areas of interest are nursing informatics competencies and the diffusion of innovative technologies, especially those impacting learning.


Oculus Rift Dev Kit 2: More Simulation Promise

There’s no shortage of people eagerly following the development of the Oculus Rift, and I’m definitely one of them given I see the Oculus as central to my PhD research. In case you hadn’t heard, the second version of the Oculus Developer Kit is slated for a July 2014 release. It contains a bunch of improvements, not least of which is a much nicer looking unit.

Have a look for yourself:

The position tracking improvements are particularly exciting from a clinical simulation viewpoint, as is the improved quality of what you’re seeing. Being able to lean forward, backward, or sideways and have that movement fully reflected means a lot of body language cues can now be recreated virtually. Add in significant body language options and you have a huge improvement to the realism of the scenario.

Now that the Oculus crew are owned by Facebook, they’re thinking pretty big, claiming they hope to create a billion-user VR MMO. There have been some big claims in this space before, but this one’s up there.

Diving In

It’s been a positive 12 months in regard to my research – although none of the actual data collection has started, and it’s probably a while off yet. That said, I’ve made some serious inroads into understanding the body of literature surrounding virtual environments and clinical simulation.

I’ve also submitted my first journal article on the topic – it’s a ‘state of play’ article with a focus on what needs to occur to convince educators that virtual environments can be a key component in their simulation arsenal. The article reviewers have given some great feedback and the final version should be submitted in the next week. Then it’s on to getting the first draft of my lit review structured and written.

It’s been a rewarding year on two other fronts. First, I have three great supervisors who are very committed to the research area and have been a huge help so far. Second, I’ve had some great chats with the developer who’ll be helping me build the simulation. We’re actually setting up a site to record our journey, which I’ll obviously flag here when it’s ready to go.

For those of you still beavering away in the field – I hope it’s a rewarding year for you!



Space Glasses: Merging Physical and Virtual

Sorry for the lack of updates, but my studies have been moving along fairly well and I’m getting closer to building the simulation I’ve been thinking about. One aspect of it may involve technology like SpaceGlasses. Have a look at the video and you’ll see pretty immediately what its application to a virtual world-based simulation may be:

I have no doubt the examples shown there are buffed up for the video, but I’m hopeful the SpaceGlasses will deliver a further tool in the consumer-priced virtual reality arsenal.

The release is scheduled for January 2014, so we’ll soon see.

The Omni: Another Piece In The Consumer VR Puzzle

I have to say, it doesn’t get much more exciting in this field than what’s occurring from an equipment viewpoint over the next year. I’ve talked about the touchless Leap Motion interface already (it’s due to be released pretty soon), and then there’s the Oculus Rift as arguably the first high-quality consumer VR headset. Both of these are likely to form part of my PhD research, but there’s a third piece of the puzzle that, while unlikely to be used in my research, will fulfil a long-term desire of mine in regard to gaming.

It’s called the Omni, and it looks set to bring an affordable human movement option to VR. I’ve always loved the idea of being able to get fit from gaming, and the Omni may just pull that off if it delivers on what it’s planning.


Have a look for yourself and note that the headset is the Oculus Rift, not a component of the Omni:

The team is still raising funds – they were seeking 150K and close to 900K has been pledged, so there’s no shortage of interest. Check out the full details, including some further videos, here.

Latency and Virtual Reality – The Deeper Stuff

John Carmack created this, and set me on the road to gaming obsession

John Carmack is a bit of an icon in gaming circles, and he’s also one of the people supporting the Oculus VR consumer headset that’s on the near horizon. I’d very stupidly assumed (having not read any biographical details about him until today) that he wasn’t that deep into the coding and science of things like this.

He’s just posted a nice piece of work on the challenges of latency in virtual reality. If you’re from a computer science background you’ll get a lot more out of it than I did, and even I could appreciate just how critical latency is in this sphere.

Latency is of course an important consideration anywhere, but Carmack shows just how far we probably have to go to make VR headsets that give an accurate perception of real-time movement in physical space. It’ll happen, of course – and I still want an Oculus now.
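To get a feel for why latency is so hard, here’s a back-of-envelope motion-to-photon budget. The stage timings are illustrative assumptions on my part, not figures from Carmack’s post:

```python
# Back-of-envelope motion-to-photon latency budget for a VR headset.
# All stage timings below are illustrative assumptions.

pipeline_ms = {
    "sensor sampling + fusion": 2.0,   # IMU read and orientation estimate
    "game simulation": 8.0,            # one tick of game logic
    "GPU render": 11.0,                # one frame at ~90 Hz
    "display scan-out": 11.0,          # worst case: a full refresh interval
}

total = sum(pipeline_ms.values())
print(f"total motion-to-photon latency: {total:.1f} ms")

# A commonly cited comfort target is around 20 ms. This naive serial
# pipeline overshoots it, which is why VR renderers re-sample head
# orientation and re-project the frame just before scan-out ("time warp").
print("within 20 ms target:", total <= 20.0)
```

Even with generous per-stage numbers, a straightforward serial pipeline blows past the comfort target, which is why the mitigation strategies Carmack discusses matter so much.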

Papertab: Washable Computing?

An amazing piece of development by a public/private consortium. It’s a natural evolution of tablet computing and, although it clearly has a long way to go, the implications for virtual worlds-based education are obvious – particularly in the health field, where hardy devices are needed that can easily be cleaned.

In 5 to 10 years I can see options like this achieving exactly that. If the price point ends up being reasonable, it’s also a great option at the pre-registration training level. Taking a purely virtual worlds perspective: if I can load up a good immersive simulation on something like the PaperTab, hand it out to participants at the beginning of a session, walk them through the simulation, and let them walk out with it, I have to be making some inroads. That’s without the collaboration and interaction options already being shown between PaperTabs. Of course, it’ll depend on how successfully the developers can move beyond the grayscale version on offer now, but I’d imagine that’s a given over a period of years.

Have a look for yourself:

The full press release:

Cambridge, UK and Kingston, Canada – January 7, 2013 — Watch out tablet lovers — A flexible paper computer developed at Queen’s University in collaboration with Plastic Logic and Intel Labs will revolutionize the way people work with tablets and computers. The PaperTab tablet looks and feels just like a sheet of paper. However, it is fully interactive with a flexible, high-resolution 10.7″ plastic display developed by Plastic Logic, a flexible touchscreen, and powered by the second generation Intel® Core i5 processor. Instead of using several apps or windows on a single display, users have ten or more interactive displays or “papertabs”: one per app in use.

Ryan Brotman, research scientist at Intel, elaborates: “We are actively exploring disruptive user experiences. The ‘PaperTab’ project, developed by the Human Media Lab at Queen’s University and Plastic Logic, demonstrates innovative interactions powered by Intel Core processors that could potentially delight tablet users in the future.”

“Using several PaperTabs makes it much easier to work with multiple documents,” says Roel Vertegaal, director of Queen’s University’s Human Media Lab. “Within five to ten years, most computers, from ultra-notebooks to tablets, will look and feel just like these sheets of printed color paper.”

“Plastic Logic’s flexible plastic displays are completely transformational in terms of product interaction. They allow a natural human interaction with electronic paper, being lighter, thinner and more robust compared with today’s standard glass-based displays. This is just one example of the innovative revolutionary design approaches enabled by flexible displays,” explains Indro Mukerjee, CEO of Plastic Logic.

What’s your take on this?