
VR and Latency: Carmack's Thoughts

\"OhThis post originally appeared over at our sister site Metaverse Health.

John Carmack is a bit of an icon in gaming circles, and he's also one of the people supporting the Oculus VR consumer headset that's on the near horizon. I'd very stupidly assumed (having not read any biographical details on him until today) that he wasn't that deep into the coding / science of things like this.

He's just posted a nice piece of work on the challenges of latency in virtual reality. If you're from a computer science background you'll get a lot more out of it than I did, and even I could appreciate just how critical latency is in this sphere.

Latency is of course an important consideration anywhere, but Carmack shows just how far we probably have to go to make VR headsets that give an accurate perception of real-time movement in physical space. It'll happen of course – and I still want an Oculus now.
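For a rough sense of why latency matters so much for head-mounted displays, here's a back-of-envelope sketch. The numbers are illustrative assumptions of mine, not figures from Carmack's post: during a head turn, the rendered image effectively lags the head by the motion-to-photon latency, and that angular lag translates into pixels of misregistration on the display.

```python
# Illustrative sketch (assumed numbers, not from Carmack's post):
# how far the image "lags" behind the head during a rotation,
# for a given motion-to-photon latency.

def angular_error_deg(head_speed_deg_s, latency_ms):
    """Degrees the head moves before the matching frame reaches the eye."""
    return head_speed_deg_s * latency_ms / 1000.0

def pixel_error(head_speed_deg_s, latency_ms, fov_deg=90.0, width_px=1280):
    """Convert that angular lag into pixels of on-screen misregistration."""
    return angular_error_deg(head_speed_deg_s, latency_ms) * width_px / fov_deg

# A moderate 60 deg/s head turn at 50 ms of total latency:
print(pixel_error(60, 50))  # ~42.7 px of displacement
```

Even at that modest head speed, tens of pixels of error is easily perceptible, which is why the latency budgets discussed for VR are so much tighter than for desktop gaming.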

Daden Unveil Oopal

Oopal (pronounced oo-pull) is Daden's latest offering: a web-based editor allowing you to place and edit objects in a 2D environment, which will then roll out to the 3D environment (currently OpenSim and Second Life with Unity3D support coming in the next 6 months). Watch this brief walkthrough video to check it out for yourself:

OOPAL Quick Introduction from DadenMedia on Vimeo.

The full press release from Daden:

Birmingham UK, 27th June 2012: Educators and trainers can now create engaging immersive learning exercises more easily and rapidly using an innovative web-based application called OOPAL, developed by learning and visualisation specialists Daden Limited.

OOPAL (Object Orientated Practice and Learning) lets educators and trainers with little technical knowledge use the web to build 3D sets from an existing library of objects, and create, edit and manage the scenarios and simulations entirely from the web. Only when they're ready to deploy do they need to enter the 3D virtual world and "push the button" to materialise the sets and exercises ready for students to use. With OOPAL, educators – and even students – can create and maintain worthwhile learning experiences without needing to be virtual world experts.

Daden have been creating immersive learning experiences since 2008. Built on the success of their award-winning virtual learning authoring software PIVOTE, Daden's second generation system, OOPAL, makes exercise creation and maintenance significantly simpler – making it easier to involve tutors and even students in the design and build process.

David Burden, Daden's Managing Director, says "We found that the easiest way to describe immersive learning experiences was in terms of a drama – thinking about actors and props, the script and their behaviours rather than abstract concepts like nodes and links – and we've designed OOPAL to reflect that – considerably easing the process from exercise design to implementation."

A key feature of OOPAL is that it allows educators to lay out the 3D environment using a simple 2D "kitchen designer" type layout tool. Drawing from a library of props and virtual actors, educators can assign behaviours to each object – how they will react when touched, pushed, spoken to or approached. Dialogues can even be assigned to the virtual actors for use within the simulation. Users can build just a single room or even a whole environment. What's more – once they have built their set and simulation they can create multiple copies in their virtual world – again at the touch of a button.
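To make the "drama" framing of props, actors and behaviours concrete, here is one hypothetical way such an exercise could be described as data. The schema and every name in it are my own illustration – none of this is drawn from OOPAL's actual format:

```python
# Hypothetical sketch of an OOPAL-style exercise as data: props and actors,
# each with behaviours triggered by student interactions. The schema and all
# names are invented for illustration; they are not Daden's real format.
scenario = {
    "name": "Ward Round",
    "objects": [
        {"type": "prop", "name": "defibrillator",
         "behaviours": {"on_touch": "play_animation:charge"}},
        {"type": "actor", "name": "patient",
         "behaviours": {"on_approach": "say:Hello, nurse.",
                        "on_spoken_to": "dialogue:patient_history"}},
    ],
}

def triggered_action(scenario, obj_name, event):
    """Look up what an object does in response to a student event."""
    for obj in scenario["objects"]:
        if obj["name"] == obj_name:
            return obj["behaviours"].get(event)
    return None

print(triggered_action(scenario, "patient", "on_approach"))  # say:Hello, nurse.
```

The appeal of a representation like this is exactly what Burden describes: an educator edits a script of actors and props, and the system worries about materialising it in-world.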

Fundamental to the use of OOPAL within a professional learning environment is its ability to log and time-stamp every student interaction within the exercise. This can be reviewed within OOPAL, or exported in whole or part to a VLE or LMS. OOPAL also supports scoring mechanisms for in-exercise feedback.
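A time-stamped interaction log with in-exercise scoring might look something like the following sketch. The field names and scoring scheme are assumptions for illustration, not OOPAL's actual log or export format:

```python
from datetime import datetime, timezone

# Hypothetical record of student interactions plus a simple scorer.
# Field names are illustrative; OOPAL's real log/export format isn't public here.
def log_interaction(log, student, obj_name, event, score=0):
    log.append({
        "timestamp": datetime.now(timezone.utc).isoformat(),  # time-stamp every event
        "student": student,
        "object": obj_name,
        "event": event,
        "score": score,
    })

def total_score(log, student):
    """In-exercise feedback: sum one student's scores across the log."""
    return sum(entry["score"] for entry in log if entry["student"] == student)

log = []
log_interaction(log, "alice", "defibrillator", "touched", score=5)
log_interaction(log, "alice", "patient", "spoken_to", score=10)
print(total_score(log, "alice"))  # 15
```

A flat list of records like this is also the natural shape for export: each entry maps cleanly onto the activity-tracking formats most VLEs and LMSes ingest.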

David says "One of the obstacles in the adoption of immersive environments for learning has been the need for educators to be experts – not in their field of study but in building within virtual worlds. OOPAL dramatically reduces that barrier and gives educators and trainers the tools to create real-world learning experiences for their learners in a 3D environment".

OOPAL can be accessed as a cloud-hosted service from Daden, or installed on an organisation's own servers. OOPAL currently enables exercises to be developed in both OpenSim and Second Life. Daden plan to release a Unity3D version, and a web/iPad player, in the next six months.

So what do you think? My initial impression from watching the video is that it would simplify things to some extent, though the technical knowledge required is perhaps still a little high for some people. Personally, I'll be really keen to see the Unity version and what it brings to the fray.

Euclideon pops its head above the parapet

\"\"In August last year I posted the last of a few articles on promising new graphics technology called Unlimited Detail. As I posted there, the team were going to ground to work on getting the technology to a stage where they have something even more substantive to show off.

That may be a little while off yet, but xbigygames.com has an interesting piece on how Euclideon are doing. A snippet:

As mentioned when Euclideon was first revealed, this technology is something they plan to utilise not only for video games but also scientific research. Supposedly there will be “some Euclideon products released in non-games related industries over the next few months”. “There turned out to be a lot of demand for our capabilities across quite a few industries, so we have tried to put that demand in order and address each area one at a time. As soon as we have revenue coming in, we can expand our team into different departments to deal with each industry,” Dell tells us.

“I think it’s fair to say that people are starting to accept that the future of 3D graphics is atomic,” he finally points out. “Polygons will still be around a bit longer as an editing tool, but I don’t know how much longer they will remain for visualisation. So many games today have polygons that are so small that they are only a few pixels in size. When polygons become smaller than the 3 corner points that make them, there is no point in treating them like triangles anymore and it makes sense to use atoms instead.”

On the question of when we will get our next look at Euclideon-powered gaming, all Dell responds is, “Well there is soooooo much I’d love to say about that, but I’m afraid that I’m sworn to silence at this point in time. My apologies, but I think you’ll find it worth the wait.”

So things are still progressing and we should start to see some implementations of the tech before the end of the year by the sound of it.

Thanks to Phillip Street for the heads-up!

Federal Virtual Worlds moving beyond Second Life

\"\"Several years ago, the National Oceanic and Atmospheric Administration maintained more than a dozen virtual environments for online visitors to explore in Second Life. Now it operates just one.

For NOAA and other federal agencies, the focus of virtual world activity has moved beyond Second Life and diversified onto other platforms and gaming engines, according to Eric Hackathorn, a 3D Web designer for NOAA and one of the federal pioneers in virtual worlds.

“Virtual worlds are in need of some rebranding,” Hackathorn told Federal Computer Week. “Historically, virtual worlds were synonymous with Second Life, but that is no longer the case.”

While several agencies, including NOAA, NASA, the Defense Department and the National Library of Medicine, maintain a presence in Second Life, several current initiatives have shifted to open source and in-house platforms and interagency efforts, he said. For example, DOD's PTSD Experience invites users to learn about post-traumatic stress disorder.

“There is a lot of activity and many different use cases,” Hackathorn said, with initiatives for training, innovation and research in 3D and gaming environments.

The upcoming Federal Consortium for Virtual Worlds’ annual conference starting on May 16 will highlight some of those programs.
See on fcw.com

'3D Virtual Campus Tours' gains traction

\"\"

I had a note from Andrew Hughes, Adjunct Instructor at the University of Cincinnati and head honcho of Designing Digitally Inc, on the success to date of their 3D Virtual Campus Tours product. Mirror worlds are of course well established and were one of the original ways universities and businesses utilised virtual worlds.

Universities in particular are an obvious market: new students have a genuine interest in learning how to find their way around, and virtual environments are ideally suited to helping them do so.

I shot some questions back to Andrew Hughes to get some more information on where 3DVCT sees itself in the marketplace and where its unique value lies.

Q: What was the original impetus to develop specifically for campus tours?

A: We have built over 30 campuses inside Second Life, Opensim, and other virtual worlds only to find that we're not thinking about the convenience factor for novice users. On the web we give around 2 seconds for a website to load before we move on. We were looking to build a browser-based campus and it just so happened that the United States Air Force Academy was looking for a virtual campus tour that was online and completely a browser-based replica of their campus. We won the contract and have built a browser-based high end campus tour with built-in communication tools and live and AI guided tours.

Q: What sort of response have you had to date from universities, including international universities?

A: We launched the product in March of this year. With the build we have done with the United States Air Force Academy, they have had four thousand recruits through the space at this current time. We have a handful of universities both in the USA and outside the USA we're building now, but we are under NDAs with them and cannot disclose their name, the nature of the campus's needs etc until they are launched on the client's website.

Q: We have a lot of readers who are educators: can you give a little insight on the platform 3DVCT is built on, including how easy it would be to implement at a university with more restricted IT infrastructure?

A: We're using the Unity 3D gaming engine and a complex MMO system that is connected to a dynamic server or servers. The development of the system has a complete content management system for users, history, macros for the tours, and even the ability to control the AI bot and what she says within the CMS. The databases are able to fully integrate into an existing CRM or ERM software used by the university so that there is one streamlined process.

We work hard to learn the process from day one of a potential student to the date he or she signs up for the first class. We then build the system to be as integrated and as easy as possible for the University. We also have extensive experience in building in Unity 3D to the extent that we've been quoted by their CEO for our talents. The reason I state this is that we can change the ports used to adhere to the client's specifications. We also can cloud the system so that it loads faster and is a little less processor heavy on the end user.

Q: Obviously it will vary but can you give a ballpark cost for a standard university campus tour from development to implementation? And how do you think this compares to other options in the marketplace?

A: Our company is very good at what we do and so we're in line with any other completely custom built browser-based virtual MMO developer. We also do pricing per enrollment size. So a smaller college will get a discounted rate depending on the pricing of the current student enrollment. Right now there is not a virtual world focused on just giving virtual campus tours. Right now in the industry other virtual campus tours are 360 panoramas or Google overhead maps. An experience like that won't let the student see how big the dorm room is compared to his or her size, nor would it allow them to actually walk around a to-scale campus to see where everything is and get familiar with the campus by actually walking around in it or talking to a live admin rep through the voice and text chat we have built in.

Our virtual space is built in the high end gaming engine called Unity 3D and has had over two years of R&D built into it, so that the process can be done quickly and at a professional level you cannot get from Second Life or any other virtual world out there.

Q: What arguments would you make for your platform as opposed to, say, a university going it alone and developing an OpenSim grid on which to mirror their university and conduct tours?

A: We have the ability to do the following, unlike the virtual worlds you speak of above:

1. Full AI Technology
2. Control over the avatar experience
3. Custom ability to change ports
4. Higher quality of development
5. Runs in a web browser
6. Does not have a large learning curve to get into the world
7. Fully customizable both interface, experience, branding, etc.
8. Ability to be skinned and placed on your website for full ownership
9. Full content management system for the ability to control bots, users, history user tracking where they were, etc.

This is far beyond what those other platforms could ever do – I state this as we're well known for our SL and Opensim builds and we found that we cannot recruit students effectively.

Q: What are Designing Digitally\’s plans for the coming year?

A: We are working on 3D training simulations, and virtual worlds for government and corporate clients. Many of them are either under NDA or classified government projects. We will be launching a financial literacy system for people to learn how to manage money, buy houses, etc. This will include both Flash and Unity simulations within it. We are also going to be attending the following conferences:

- ASTD 2012
- SALT 2012

3DVCT will be at:
- Noel-Levitz 2012
- NACAC 2012

———-

So there you have it – the 3DVCT product has hitched its wagon firmly to the Unity3D platform, an obvious trend in the simulation field in particular. For what it's worth, the time I spent checking out 3DVCT further reinforced to me the responsiveness of Unity3D. It's not the panacea for everything but it's dominating some key virtual world niches – which lays down a significant challenge to competitors. That can only be good for the ongoing evolution of the industry.

On the fly 3D surface reconstruction: KinectFusion

Microsoft's Kinect is rightfully getting a lot of attention from researchers. One snippet that caught my attention is a collaboration between Microsoft and a number of UK and Canada-based researchers. The result is KinectFusion.

Have a look for yourself:

The implications for virtual worlds are fairly obvious. The thing that particularly struck me is the dynamic capability of the approach even at this early stage – if something changes with the physical world environment, it is reflected virtually. For the education, science and health fields, to name three, this is huge.

One obvious example within my pet area of clinical simulation: a camera (with consent) is placed in a busy emergency department in a large teaching hospital. Emergency nursing students based at a rural university receive that feed, converted on the fly to 3D for use within their virtual learning environment. Students may actually 'work' a full shift virtually, needing to respond to the challenges of the changing environment as they occur.

As I said, there's a long way to go (for starters, KinectFusion is about surfaces only), but the progress is rapid and exciting. Over to you: what applications could you see this being good for?

Euclideon\’s Unlimited Detail: a hands-on

\"\"In recent days I wrote about the latest video released by Australian developers Euclideon, who are behind the \’Unlimited Detail\’ engine. In that article I claimed the video was a pretty effective rebuttal of some of the criticism / cynicism amongst the gamer community in particular.

Thanks to a convergence of schedules and geographies, I actually had the opportunity to have a hands-on with the engine myself on Friday night. CEO Bruce Dell, having just gotten off a plane from the UK, spent some time talking about his recent trip to Gamescom in Germany, the work he has on his plate and the level of interest the engine is receiving. Then it was onto some 'play' time. After 10 minutes or so of navigating the demo (the same one shown in the video), a few things struck me:

1. The absolute smoothness of the navigation experience

2. The fidelity of the graphical experience

3. It was all done on a bog standard PC laptop

4. If the same level of quality and smoothness continues after full animation capability is integrated, this is going to be one groundbreaking piece of technology.

5. If good consumer content creation tools are integrated with the engine, current virtual environments such as OpenSim and Second Life should be very, very concerned. Or at least be looking at licensing the technology.

I for one am excited to see what comes out the other end of Euclideon's self-imposed media blackout over the coming months. As I said to Bruce on the way out from our meeting: he should make the most of the time out of the spotlight, because if he pulls off what he's aiming for, it will be the last time he'll have that luxury.

Photo courtesy of Phil Testa.