Virtual Reality Is Not Just For Facebook: Chinese Military Pilots Train In 3D

In a 2014 exercise, People’s Liberation Army Navy Air Force (PLANAF) pilot cadets trained using virtual reality (VR) headsets providing 3D imagery and flatscreen monitors doubling as a head-up display (HUD), while also using civilian Saitek X52 video game joysticks. Given the poster with a JH-7 strike fighter in the background, the pilots could belong to a PLANAF maritime strike squadron. In addition to offering a cheaper, immediate step before cadets move into an aircraft simulator, the 3D VR headsets allow for augmented, integrated group training that improves interaction between trainees and instructors. The PLANAF’s introduction of VR technology at such an early stage of pilot training is part of a wider Chinese trend toward greater accuracy and rigor in all stages of military training.

Touchable 3D technology unveiled to enhance video games, driving, 3D printing

A tech firm in Japan recently introduced a technology that makes 3D images a little more tangible. Using haptic technology – found in everyday objects such as video game controllers and smartphones, where vibrations simulate real-world touch-based interactions – Miraisens has created touchable 3D technology.

The “3D-Haptics Technology” uses a virtual-reality headset and wrist-mounted box connected to a fingertip-attached molding, coin-shaped molding, stick or pen. This setup allows a user to “feel” virtual objects – the resistance of a button, for example.
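How that button resistance might be computed is not detailed by Miraisens, but a common approach in haptic rendering is a simple penalty (spring) model: the deeper the fingertip presses past the virtual surface, the stronger the feedback. The sketch below is a minimal, hypothetical Python illustration of that idea; the constants and function names are assumptions for the example, not Miraisens' implementation.

```python
# Minimal sketch (not Miraisens' algorithm): penalty-based haptic rendering of a
# virtual button. The resistance the user "feels" is modelled as a spring force
# proportional to how far the fingertip has pressed past the button surface,
# then mapped to a vibration/actuator drive level.

BUTTON_SURFACE_Z = 0.0      # virtual button surface position (metres), hypothetical
STIFFNESS = 400.0           # spring constant in N/m, chosen for illustration
MAX_ACTUATOR_FORCE = 2.0    # clamp for a small wearable actuator (N), assumed

def button_resistance(fingertip_z: float) -> float:
    """Return the feedback force (N) for a fingertip at height fingertip_z.

    Positive force means the actuator should push back against the press.
    """
    penetration = BUTTON_SURFACE_Z - fingertip_z    # how far we've pressed "into" the button
    if penetration <= 0.0:
        return 0.0                                  # not touching: no feedback
    return min(STIFFNESS * penetration, MAX_ACTUATOR_FORCE)

def to_vibration_amplitude(force: float) -> float:
    """Map a force to a 0..1 drive level for a vibration motor."""
    return min(force / MAX_ACTUATOR_FORCE, 1.0)

# Example: fingertip 2 mm past the surface -> spring force -> drive level
force = button_resistance(-0.002)
print(force, to_vibration_amplitude(force))
```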

“This technology will give you a sense that you can touch objects in the 3D world,” according to Natsuo Koda, chief executive of Miraisens.

There are a number of possible applications for this touchable 3D technology, ranging from the casual (e.g., enhancing the realistic nature of video games and virtual musical instruments), to the artistic (e.g., replacing a mouse or tablet with natural gestures when creating 3D-printed objects), to the mechanical (e.g., remote controlling robots and feedback-based driving).

Miraisens wants to commercialize the technology via the electronics and services industries. A software development kit (SDK) and a content development kit are being prepared by the company with the hopes of launching a closed beta at the end of the year and an open beta in spring 2015. The company will hold a technical workshop in the U.S. in November.

Novel Material ‘Celleron’ Could Revolutionize 3D Bioprinting for Regenerative Medicine

One of the most exciting frontiers in biomedical technology is 3D bioprinting, a field that is highly anticipated and potentially game-changing. One of the biggest issues holding 3D bioprinting back is how to make structures that are truly representative of 3D biological tissue. But this technology could be about to take an open-source turn, which could accelerate and advance bioprinting as a feasible regenerative alternative to traditional treatments.

Engineers at Swansea University in Wales have developed a revolutionary biodegradable tissue scaffold. This new biomaterial, called Celleron, comes as both a liquid biopolymer and a filament derivative. The development, led by Dr. Dan Thomas of Swansea University’s College of Engineering, makes it possible to 3D print Celleron and replicate the underlying structures of complex tissue architectures.

After 3D printing, Celleron ferments when a biological activator is added, which causes it to become microporous. This produces a massive increase in surface area and mechanical strength, and creates paths deep into the structure for the migration of cells. Protein growth factors are then saturated into the porous scaffold to turn it into a biologically attractive composite.
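To get a feel for why microporosity matters, here is a rough back-of-the-envelope sketch in Python. The pore size and porosity figures are assumptions chosen purely for illustration, not measured Celleron values; the point is simply that filling a scaffold with micrometre-scale pores multiplies the internal surface area available for cell attachment.

```python
# Back-of-the-envelope sketch (illustrative numbers, not measured Celleron data):
# how introducing spherical micropores multiplies the surface area available for
# cell attachment inside a small scaffold block.
import math

block_side = 0.01            # 1 cm cube of scaffold, in metres
pore_radius = 50e-6          # 50 micrometre pores (assumed)
porosity = 0.5               # half of the volume becomes pore space (assumed)

block_volume = block_side ** 3
outer_area = 6 * block_side ** 2                     # surface of the solid cube

pore_volume = (4.0 / 3.0) * math.pi * pore_radius ** 3
pore_area = 4.0 * math.pi * pore_radius ** 2
n_pores = porosity * block_volume / pore_volume      # pores needed to reach target porosity

internal_area = n_pores * pore_area                  # area added by the pore walls
print(f"outer surface:    {outer_area * 1e4:.1f} cm^2")
print(f"internal surface: {internal_area * 1e4:.1f} cm^2")
print(f"gain factor:      {internal_area / outer_area:.0f}x")
```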

This top-down 3D printing process allows for the fabrication of an accurate structure, so it could find uses in a number of applications. Earlier this year the Swansea team bioprinted the complex geometry of a child’s ear from Celleron.

The Swansea team first engineered the 3Dynamic Alpha 3D bioprinter. These workhorse 3D bioprinters are currently used by many researchers across the world to deposit a range of biologically supportive materials.

Next year the Swansea team plans to share its biopolymer formulation and 3D bioprinting processes with researchers across the world, in order to rapidly accelerate the technology and allow the field of 3D bioprinting to be refined quickly. If successful, this technology will hopefully one day make a significant difference in the lives of many people.

The 10 Most Photorealistic 3D Renderings

Technology has turned the world upside down. It facilitates our daily lives and reshapes our perception of reality.

Computer graphics artists are out to completely blow our minds. Instead of going out into the city streets and capturing cityscapes, landscapes or portraits with professional cameras, they generate the same images using 3D design software. And hurrah! Guess, they say: is it real or fake? We keep guessing while surfing all around the virtual world.

Damn, sometimes you sit there for five minutes and:

1) Ask yourself whether it is a photograph or a trick by a CG artist;
2) Try to convince yourself that technology can do that;
3) Come back to the idea that it must be a photograph, because a computer can't be clever enough to recreate reality digitally;
4) Give up.

NEXT GENERATION CHARACTER RENDERING

Next-Generation Character Rendering is here. It is the culmination of many years of work on photorealistic characters.

The challenge goes beyond entertainment; it is about creating a medium for better expressing emotions and reaching the feelings of the players.

This amazing technology brings current-generation characters into next-generation life – at 180 fps on a GeForce GTX 680!

News: Weta Digital is now using GPU in production

There is no doubt that Weta Digital is one of the world’s premier visual effects companies; they are known for uncompromising creativity but also for their commitment to developing innovative technology.

At the last Siggraph they announced that their project Manuka is finally going into production, after years of research. It started life as a small R&D platform and has grown rapidly into a very serious, hard-core production renderer.

Manuka’s objective is of course to render scenes beautifully and accurately, but also to handle the vast (utterly vast) complexity requirements of the company, and it does so incredibly efficiently and quickly.

This new production renderer has already been used for the third Hobbit film, for some shots in The Desolation of Smaug and for some shots in Dawn of the Planet of the Apes. It was able to handle very complex scenes: in Dawn, for example, the “show of strength” scene in which the apes ride into San Francisco and tell the humans to stay out of the forest involved vast numbers of apes, each with complex hair that would have taxed any renderer.

Manuka is totally focused on production, but for the first time it can achieve a higher level of quality by implementing algorithms such as BDPT (bidirectional path tracing) and advanced shaders that were previously too costly for a production renderer.
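Manuka itself is proprietary, so as a rough illustration only, the sketch below is a toy unidirectional path tracer in Python: random paths are traced from the camera and accumulate emitted light at each bounce. Bidirectional path tracing extends this by also tracing subpaths from the light sources and connecting the two sets of vertices, which is what makes effects such as caustics tractable. The scene, materials and sample counts here are invented for the example.

```python
# Toy unidirectional path tracer (illustration only; Manuka's BDPT additionally
# traces subpaths from the lights and connects them to camera subpaths).
import math, random

class Sphere:
    def __init__(self, center, radius, albedo, emission=(0, 0, 0)):
        self.c, self.r, self.albedo, self.emission = center, radius, albedo, emission

def sub(a, b): return (a[0]-b[0], a[1]-b[1], a[2]-b[2])
def add(a, b): return (a[0]+b[0], a[1]+b[1], a[2]+b[2])
def mul(a, s): return (a[0]*s, a[1]*s, a[2]*s)
def hadamard(a, b): return (a[0]*b[0], a[1]*b[1], a[2]*b[2])
def dot(a, b): return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]
def norm(a):
    l = math.sqrt(dot(a, a)); return mul(a, 1.0 / l)

def intersect(ray_o, ray_d, sphere):
    """Return the nearest positive hit distance along a unit ray, or None."""
    oc = sub(ray_o, sphere.c)
    b = dot(oc, ray_d)
    disc = b*b - (dot(oc, oc) - sphere.r*sphere.r)
    if disc < 0: return None
    s = math.sqrt(disc)
    for t in (-b - s, -b + s):
        if t > 1e-4: return t
    return None

def cosine_sample_hemisphere(n):
    """Sample a direction around normal n with a cosine-weighted pdf."""
    u1, u2 = random.random(), random.random()
    r, phi = math.sqrt(u1), 2 * math.pi * u2
    x, y, z = r*math.cos(phi), r*math.sin(phi), math.sqrt(1 - u1)
    # build an orthonormal basis around n
    a = (1, 0, 0) if abs(n[0]) < 0.9 else (0, 1, 0)
    t = norm((n[1]*a[2]-n[2]*a[1], n[2]*a[0]-n[0]*a[2], n[0]*a[1]-n[1]*a[0]))
    b = (n[1]*t[2]-n[2]*t[1], n[2]*t[0]-n[0]*t[2], n[0]*t[1]-n[1]*t[0])
    return norm(add(add(mul(t, x), mul(b, y)), mul(n, z)))

SCENE = [
    Sphere((0, -1000.5, -3), 1000, (0.8, 0.8, 0.8)),           # diffuse floor
    Sphere((0, 0, -3), 0.5, (0.7, 0.2, 0.2)),                  # red ball
    Sphere((0, 3, -3), 1.0, (0, 0, 0), emission=(10, 10, 10)), # area light
]

def radiance(ray_o, ray_d, depth=0):
    if depth > 4: return (0, 0, 0)                 # hard path-length cap
    hit_t, hit_s = None, None
    for s in SCENE:
        t = intersect(ray_o, ray_d, s)
        if t is not None and (hit_t is None or t < hit_t):
            hit_t, hit_s = t, s
    if hit_s is None: return (0, 0, 0)             # escaped: black background
    p = add(ray_o, mul(ray_d, hit_t))
    n = norm(sub(p, hit_s.c))
    if dot(n, ray_d) > 0: n = mul(n, -1)           # face the incoming ray
    # Cosine-weighted sampling of a diffuse BRDF cancels the cosine/pdf terms,
    # so the estimator is simply emission + albedo * incoming radiance.
    bounce = cosine_sample_hemisphere(n)
    return add(hit_s.emission, hadamard(hit_s.albedo, radiance(p, bounce, depth + 1)))

# One pixel's worth of samples through the scene centre, just to show the estimator.
estimate = (0.0, 0.0, 0.0)
for _ in range(64):
    estimate = add(estimate, radiance((0, 0, 0), norm((0, 0, -1))))
print(mul(estimate, 1 / 64.0))
```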

One test image, rendered with the hair shader, also shows that Manuka is able to render caustics effects efficiently and accurately.

News: AMD GPU rendering technology

Siggraph is surely the place to be for anyone who loves 3D technologies and announcements. One of the most interesting of these is the new GPU renderer developed by AMD, able to run on the CPU as well as on any GPU.

It is the first time we have seen a “production” renderer able to run on any kind of hardware.
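AMD has not published the renderer's code, but the underlying idea – one kernel source compiled at run time for whichever compute device is present – is what OpenCL provides. As a hedged illustration only, the PyOpenCL sketch below compiles a trivial "shading" kernel and runs it on a GPU if one is found, otherwise on the CPU; it is not AMD's renderer, and the kernel is invented for the example.

```python
# Minimal PyOpenCL sketch (not AMD's renderer): the same kernel source is
# compiled for whatever OpenCL device is available, CPU or GPU, which is the
# idea behind a renderer that "runs on any kind of hardware".
import numpy as np
import pyopencl as cl

KERNEL_SRC = """
__kernel void shade(__global float *out) {
    int i = get_global_id(0);
    out[i] = (float)i / (float)get_global_size(0);  // trivial gradient "shading"
}
"""

def pick_device():
    """Prefer a GPU, fall back to a CPU device if no GPU is exposed."""
    devices = [d for p in cl.get_platforms() for d in p.get_devices()]
    gpus = [d for d in devices if d.type & cl.device_type.GPU]
    cpus = [d for d in devices if d.type & cl.device_type.CPU]
    if gpus:
        return gpus[0]
    if cpus:
        return cpus[0]
    raise RuntimeError("no OpenCL device found")

device = pick_device()
ctx = cl.Context(devices=[device])
queue = cl.CommandQueue(ctx)
program = cl.Program(ctx, KERNEL_SRC).build()

n = 1024
result = np.empty(n, dtype=np.float32)
buf = cl.Buffer(ctx, cl.mem_flags.WRITE_ONLY, result.nbytes)
program.shade(queue, (n,), None, buf)      # identical call on a CPU or GPU device
cl.enqueue_copy(queue, result, buf)
print(device.name, result[:4])
```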

Takahiro Harada, one of AMD’s technology evangelists, showed a small demo of this new rendering technology, already integrated inside Autodesk Maya and running on 4 x FirePro W8100 video cards.

This renderer already supports advanced features and provides real-time feedback – but rather than more words, here is a small video of the demo.

Of course, we expect to have more details at the next Siggraph…