Wednesday 20 January 2010

Arduino and accelerometer harmoniously come together in DIY music controller

Look, Physical Computing can be a drain, particularly when your summa cum laude status hinges on acing the final. We're guessing that one Ryan Raffa managed to pull off a pretty decent grade, as his final project is nothing short of delectable. In a (presumably successful) attempt to wow onlookers and professors alike, Ryan cooked up an audio controller built around an ADXL335 accelerometer (for motion sensing) and an Arduino board that communicates serially with Max MSP. The controller itself boasts inputs for five tracks, while a sixth button applies a delay to all of them; he was even kind enough to post up the Max MSP and Arduino code (it's there in the source link), and if you're interested in hearing what all the fuss is about, be sure to hop past the break and mash play.
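For the tinkerers out there, the Arduino side of a rig like this is refreshingly simple. Below is a minimal sketch of the idea, assuming the ADXL335's three analog outputs sit on A0 through A2 and the six buttons live on digital pins 2 through 7, with every reading streamed over the serial port for Max MSP to pick apart. It's an illustration of the approach, not Ryan's actual code (that's in the source link).

// Minimal sketch in the spirit of the project above: an ADXL335 (analog
// accelerometer) plus six buttons, streamed over serial for Max MSP to parse.
// Pin assignments and the message format are assumptions for illustration.

const int xPin = A0;                            // ADXL335 X axis
const int yPin = A1;                            // ADXL335 Y axis
const int zPin = A2;                            // ADXL335 Z axis
const int buttonPins[6] = {2, 3, 4, 5, 6, 7};   // five track buttons + one delay button

void setup() {
  Serial.begin(9600);                       // Max MSP listens at the same baud rate
  for (int i = 0; i < 6; i++) {
    pinMode(buttonPins[i], INPUT_PULLUP);   // buttons wired to ground, active LOW
  }
}

void loop() {
  // Accelerometer: three 10-bit analog readings (0-1023)
  Serial.print(analogRead(xPin)); Serial.print(' ');
  Serial.print(analogRead(yPin)); Serial.print(' ');
  Serial.print(analogRead(zPin));

  // Buttons: 1 = pressed, 0 = released
  for (int i = 0; i < 6; i++) {
    Serial.print(' ');
    Serial.print(digitalRead(buttonPins[i]) == LOW ? 1 : 0);
  }
  Serial.println();   // newline marks the end of one frame for Max to unpack
  delay(20);          // ~50 frames per second is plenty for gesture control
}

On the Max side, a [serial] object at the same baud rate reads each frame in, and from there it's routing and mapping all the way down.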




Physical Computing - Final Project - Max MSP Controller from Ryan Raffa on Vimeo.

Saturday 9 January 2010

Microsoft names Natal release date

The Xbox gaming system that uses the human body as a controller will go on sale by the 2010 "holiday season".

Microsoft Entertainment and Devices President Robbie Bach made the announcement at the opening of the Consumer Electronics Show (CES) in Las Vegas.

Natal allows gamers to play without touching a controller.

It uses a 3D vision system to monitor players' movements and facial expressions.

Friday 8 January 2010

A fresh supply of thoughts about sciences of the artificial and artificial things that think.

Our creations are becoming increasingly intertwined with computer science, and technology drives the zeitgeist. Intelligence is explored through artificial intelligence and computation. We create complex machinery and, through it, come to understand our own.






Since activation in 2007, ThinkArtificial.org has enjoyed having articles featured and referenced on sites including Robots Podcast, Robots.net, Massively, ReadWriteWeb, io9, Games Alfresco, Jamais Cascio’s Open The Future, Slate Magazine, Absolut Vodka’s Machines campaign, Popular Mechanics RU, CNET News’ Crave, and many others.

Think Artificial was a finalist in the SXSW Interactive Web Awards ‘09, an award presented by Adobe Systems Inc. The site’s visual design (Gray Matter) is also ranked in the top 20% of some 17,000 design websites at Command-shift-3 (voting is public).

Please use Think Artificial’s Contact Form to get in touch. A physical mail address for press kits, products, etc. is available in certain cases upon request.

Google Goggles - Photo recognition in real-time for augmented reality info

Here’s something fresh from Google’s oven: the Google Goggles app for Android phones. Despite my letdown on realizing they aren’t real goggles, this is a mark of things getting interesting. Mobile AR apps are mutating and shifting into various forms, and the possibilities of the tech are certainly starting to form a big picture in the heads of developers. It’s here to stay, all right.

The image recognition tech sounds exciting: image search and recognition in real time! I wouldn’t be surprised to see Google and Apple go head to head in a bloodsport match as they race towards the AR advertising market (incidentally bringing with them a wave of exciting apps, and even AR goggle interfaces. Real ones.).

But it’s best to let the video do the talking (read: I’m lazy). Here’s Google Goggles.


Tuesday 1 December 2009

thinking....

Project Natal unveiled as new 'controller' experience

Take a look at Milo, which uses Project Natal for amazing interaction with an on-screen character.


Monday 30 November 2009

Sony Patents 'Emotion Detecting' PS3 Technology


Motion controls? Apparently the future of videogame controls lies not in flailing arms, but in emotional expression. Siliconera reports (via Joystiq) that Sony has filed a patent for a new technology that uses a camera and microphone to detect a player's emotions, including laughter, sadness, anger, joy, and even boredom.


As you can see in the diagram above (which we're pretty sure comes directly from a scene in Idiocracy), this new technology will read metadata derived from a camera and microphone (the PlayStation Eye, we presume) to gauge your emotional response to what you're seeing, such as laughing at a humorous portion of a game. We can only assume this means the technology will also be able to read the terror you immediately feel when you realize your PlayStation 3 could tell you were laughing.

According to the patent filing, this technology identifies emotions by reading facial expressions, as well as group behavior such as two players high-fiving each other. Boredom, for instance, could be detected depending on whether someone is "looking away from the presentation, yawning, or talking over the presentation."
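To make that a little more concrete, here's a toy sketch (plain C++, and emphatically not Sony's implementation) of how metadata like that might get folded into an emotion guess; every field name and threshold below is invented purely for illustration.

// A toy illustration of mapping camera/microphone "metadata" to an emotion
// label. The fields and thresholds are made up for the example.
#include <iostream>
#include <string>

struct FrameMetadata {
  bool  facingScreen;     // from face/gaze tracking
  bool  yawning;          // from facial-expression analysis
  bool  smiling;
  float speechOverlap;    // fraction of the clip where players talk over the presentation
  float laughterLevel;    // 0..1 confidence from the microphone
};

std::string guessEmotion(const FrameMetadata& m) {
  if (m.laughterLevel > 0.6f && m.smiling)                    return "laughter";
  if (!m.facingScreen || m.yawning || m.speechOverlap > 0.5f) return "boredom";
  if (m.smiling)                                              return "joy";
  return "neutral";
}

int main() {
  FrameMetadata bored{false, true, false, 0.7f, 0.05f};  // looking away, yawning, chatting
  std::cout << guessEmotion(bored) << "\n";              // prints "boredom"
  return 0;
}

A real system would lean on actual computer vision and audio analysis rather than hand-rolled thresholds, but the shape of the decision is the same.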

Exactly what applications Sony has planned for use with this technology in videogames isn't included in the patent (specific intents usually aren't included in patent filings), so it's hard to say how serious Sony is about someday using this tech, or how they would even plan to use it in the first place.