

Cool 3D music machine playing...
"A Computer Generated 3D Music Machine that plays a good tune with drums, guitar, piano, chimes, cymbals, bells, and many other musical instruments using lots of balls. ~This is the complete version! Meaning nothing has been cut!"
I just came back from a Metaverse1 workshop in Spain. Many people have asked me how I made my avatar look so similar to me... well, it was simple. I used CyberExtruder. The following video describes the entire process in Second Life lingo. The technical side of it is fascinating: using a regular, flat 3D passport picture (with good light) to build a 3D representation based on a similar model. See also www.cyberextruder.com.
Last year the company received a patent. According to the press release:
CyberExtruder.com, Inc., a software company specializing in the field of computer vision, has been granted patents from both the United States Patent and Trademark Office (USPTO) and the World Intellectual Property Organization (WIPO). The patents, entitled "Apparatus and Method for generating a three-dimensional representation from a two-dimensional image," protect the company's process for automatically creating a 3D model of a person's head from just a single 2D image.
Google also seems to be connected with this effort, according to this news item:
This royalty-free standard will be developed under the proven Khronos development process with a target of a first public release within 12 months. Any interested company is welcome to join Khronos to make contributions, stand for chair, influence the direction of the specification and gain early access to draft specifications before public release. The working group will consider various approaches including exposing OpenGL and OpenGL ES 2.0 capabilities within ECMAScript. The Khronos Accelerated 3D on Web working group will commence work during April 2009. More details on joining Khronos can be found at http://www.khronos.org/members/
In a statement, Google engineering director Matt Papakipos said: "With more and more content moving to the web and JavaScript getting faster every day, the time is right to create an open, general-purpose API for accelerated 3D graphics on the web."
This week, researchers from Philips Electronics plan to describe a jacket they have lined with vibration motors to study the effects of touch on a movie viewer’s emotional response to what the characters are experiencing. The key to this technology is that the jacket is activated via a virtual world (or even a simple DVD): the codes are stored and triggered at the right time and intensity. This is one of the core areas of connecting real and virtual worlds, as reflected in the future MPEG-V standard. “People don’t realize how sensitive we are to touch, although it is the first sense that fetuses develop in the womb,” says Paul Lemmens, a Philips senior scientist who will present research done with the jacket at the IEEE-sponsored 2009 World Haptics Conference in Salt Lake City.
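The store-and-trigger idea described above can be sketched as a simple cue list keyed to media time. This is only a hypothetical illustration in Python: MPEG-V specifies its own XML-based sensory-effect descriptions, and the names and fields here are not from any real standard or product.

```python
# Hypothetical sketch of timed haptic cues synchronized to media playback.
# MPEG-V defines XML "sensory effect" metadata; this Python model only
# illustrates the store-and-trigger idea, not the actual format.
from dataclasses import dataclass, field

@dataclass
class HapticCue:
    time_s: float        # media timestamp at which the cue fires
    actuators: list      # which vibration motors to drive
    intensity: float     # 0.0 (off) .. 1.0 (full)
    duration_s: float    # how long the vibration lasts

def cues_active_at(cues, t):
    """Return the cues whose time window covers playback time t."""
    return [c for c in cues if c.time_s <= t < c.time_s + c.duration_s]

# A tiny made-up "track" of cues for one scene.
track = [
    HapticCue(12.0, [0, 1], 0.4, 1.5),    # gentle shiver on the shoulders
    HapticCue(13.0, [30, 31], 0.9, 0.5),  # sharp pulse on the torso
]

print([c.intensity for c in cues_active_at(track, 13.2)])  # → [0.4, 0.9]
print(cues_active_at(track, 11.0))                         # → []
```

A player would poll (or schedule) `cues_active_at` against the playback clock and forward the resulting intensities to the garment, which is what lets the same content drive the jacket from a virtual world or from a DVD.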
The jacket contains 64 independently controlled actuators distributed across the arms and torso. The actuators are arrayed in 16 groups of four and linked along a serial bus; each group shares a microprocessor. The actuators draw so little current that the jacket could operate for an hour on its two AA batteries even if the system were continuously driving 20 of the motors simultaneously.
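The 16-groups-of-four layout above amounts to a small addressing problem, which can be sketched as follows. The frame layout, 6-bit drive level, and group/channel scheme here are my assumptions for illustration; Philips' actual bus protocol is not described in the article.

```python
# Sketch of addressing 64 actuators arranged as 16 groups of 4 on a
# serial bus, as the article describes. The byte layout and the
# group/channel scheme are illustrative assumptions, not Philips' protocol.

def actuator_address(index):
    """Map a flat actuator index (0..63) to (group, channel)."""
    if not 0 <= index < 64:
        raise ValueError("the jacket has 64 actuators")
    return divmod(index, 4)  # 16 groups, 4 channels per group

def build_frame(index, intensity):
    """Encode one two-byte command: (group id, channel<<6 | 6-bit level)."""
    group, channel = actuator_address(index)
    level = max(0, min(63, int(intensity * 63)))  # clamp to 6 bits
    return bytes([group, (channel << 6) | level])

print(actuator_address(37))  # → (9, 1): tenth group, its second motor
print(build_frame(37, 0.5))  # two bytes for the group's microprocessor
```

One microprocessor per group means the bus only needs to deliver a group id plus a short per-channel payload, which is one plausible reason for the grouped layout.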
So what can the jacket make you feel? Can it cause a viewer to feel a blow to the ribs as he watches Bruce Lee take on a dozen thugs? No, says Lemmens. Although the garment can simulate outside forces, translating kicks and punches is not what the actuators are meant to do. The aim, he says, is investigating emotional immersion.