Video: Lightdraw, Alpha 2
Friday, April 18, 2008
Wednesday, April 16, 2008
TIME.com Interview with G.R.L.
Graffiti Meets the Digital Age - Interview with James Powderly and Evan Roth of Graffiti Research Lab
http://www.time.com/time/video/?bcpid=1214055407&bctid=1483830664
Monday, April 7, 2008
Week 14 - The Teapot from 1975, Utah.
0407
I've stumbled upon a few videos related to the OpenTouch project that I've mentioned last Friday, 0404.
Introduction to MacLibre and OpenTouch
http://video.google.com/videoplay?docid=-3884258697819652860
Topics covered:
- MacLibre
- Multitouch introduction
- OpenTouch concept
- OpenTouch modules description
- OpenTouch gestures
Compiling OpenTouch on Linux and Mac OSX
oscpack
- Copy /oscpack/MakefileLinux over /oscpack/Makefile
- Change "INCLUDES = -I./" into "INCLUDES = -I./ -I./ip/" !!!important!!!
Some related files posted on GSoC file repo
http://groups.google.com/group/google-summer-of-code-discuss/files
TUIO: A Protocol for Tangible User Interfaces
http://tuio.lfsaw.de/
0408
Today I experimented with Quartz Composer mainly because it has the potential for lots of eye candy and also because it is the easiest way for me to create a splashTop with widgets. I was a little reluctant at first, but this beats having to do the aesthetics in OpenGL where it could potentially eat into my development time.
So I did a little exploring, and within minutes I was able to get a cloud to follow my mouse pointer. Then Kevin took this a step further and threw in a Utah teapot that rotates with respect to the cursor location. However, there was latency between the laser point and the mouse cursor, mainly because my Lightdraw application was calling the X11 API to move the pointer. This would be slow since X11 is not the native windowing system on Mac OS X Leopard, and also because Quartz Composer was polling the mouse pointer location on every loop.
I would think that the simplest way to resolve these issues is to send the laser coordinates from Lightdraw to Quartz Composer directly, via OpenSoundControl (OSC), much like what OpenTouch uses.
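The OSC wire format is simple enough to sketch by hand: an address pattern and a type-tag string, each null-terminated and padded to a 4-byte boundary, followed by big-endian arguments. Here is a minimal encoder for float-only messages — the `/lightdraw/laser` address is a hypothetical one I made up for illustration; the real address would be whatever the Quartz Composer OSC receiver patch is configured to listen for.

```python
import struct

def osc_pad(b: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, per the OSC 1.0 spec."""
    b += b"\x00"
    while len(b) % 4 != 0:
        b += b"\x00"
    return b

def osc_message(address: str, *floats: float) -> bytes:
    """Encode an OSC message carrying 32-bit float arguments."""
    msg = osc_pad(address.encode("ascii"))
    msg += osc_pad(("," + "f" * len(floats)).encode("ascii"))
    for f in floats:
        msg += struct.pack(">f", f)  # arguments are big-endian float32
    return msg

# Hypothetical example: normalized laser coordinates for Quartz Composer.
packet = osc_message("/lightdraw/laser", 0.5, 0.25)
```

On the wire this would go out as a single UDP datagram, e.g. `sendto(packet, (host, port))` on a `SOCK_DGRAM` socket, with the port matching whatever the receiver expects.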
0409
Finally hooked up Lightdraw and Quartz Composer to communicate over the OSC protocol. I shall not have to endure choppy X11 on the Mac anymore!
To do:
- Standardize usage of OSC protocol communication.
- Learn more about Quartz Composer. Find out if persistent variables are possible. (DONE, use Javascript)
0410 - Thanks Ben!
Ben from Apple Singapore came over to visit the Lightdraw crew. He kindly helped us by introducing a clever smoothing function into our Quartz Composer Utah teapot demo, which masked the visual disruptions caused by a user's trembling hands. I suspect I could implement similar smoothing functions in Lightdraw to correct such disruptions too. After that, Kevin and I prepared a few demos for tomorrow's video recording and for future presentations.
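Ben's exact smoothing function isn't shown here, but a common and very cheap way to damp pointer jitter is an exponential moving average over the incoming points. This is just a sketch of that idea, with the `alpha` value picked arbitrarily — it would need tuning against real laser input:

```python
class PointSmoother:
    """Exponential moving average over 2D points to damp hand jitter.

    alpha near 1.0 tracks the raw input tightly; values closer to 0
    smooth more aggressively, at the cost of visible lag.
    """

    def __init__(self, alpha: float = 0.3):
        self.alpha = alpha
        self.state = None

    def update(self, x: float, y: float):
        if self.state is None:
            self.state = (x, y)  # first sample: nothing to smooth yet
        else:
            sx, sy = self.state
            self.state = (sx + self.alpha * (x - sx),
                          sy + self.alpha * (y - sy))
        return self.state
```

The trade-off between jitter suppression and lag is exactly why this belongs in Lightdraw itself: the smoothing could then benefit every downstream application, not just the teapot demo.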
0411
Recorded our first Lightdraw video, dedicated entirely to lazer applications. That's right, it's lazer with a "z". The demos recorded involve:
- Rotating an environment-mapped Utah teapot
- Moving a rendered image of cloud
- Navigating Mozilla Firefox
- Interacting with Google Earth
These demos are more interesting now because, beyond the flashy graphics, they show how we can easily extend the laser interactions to applications with practical uses.
Here's the link to our previous video. Pardon the crappy music loop. I suggest we try using loops from fffff.at, with thanks and credit where credit is due, of course.
Outstanding tasks:
- Multiuser splashtop
- OSC multiuser protocol
- $1 gesture recognition module
- Profiling
- Auto calibration
Wednesday, April 2, 2008
Week 13 - Back to my blogging routine
0331
Here I am back to my blogging routine. I've just finished my interim report and handed it to Kevin for review. Meanwhile, I'm reading up on calibration references, primarily homography, which should correct keystone distortion. There are many other kinds of intrinsic and extrinsic calibration to be done programmatically (e.g. radial lens distortion, projector roll, etc.), but I do not think I'll have the time for these minor calibrations now.
0401
Spent the day revising my linear algebra.
I also came across an interesting gesture article called the "$1 Gesture Recogniser". I've tested the applet and it's fairly accurate for such a small piece of code. It would be great if I could add it to the Lightdraw package as a gesture recognition module.
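The core of the $1 approach is small enough to sketch: resample each stroke to a fixed number of evenly spaced points, normalize for scale and position, then score by average point-to-point distance. This is a stripped-down illustration only — it omits the rotation search the full recognizer performs, and the point count of 32 is an arbitrary choice, not the paper's default:

```python
import math

def path_length(pts):
    return sum(math.dist(a, b) for a, b in zip(pts, pts[1:]))

def resample(pts, n=32):
    """Resample a stroke to n evenly spaced points along its path."""
    interval = path_length(pts) / (n - 1)
    pts = list(pts)
    out = [pts[0]]
    accum = 0.0
    i = 1
    while i < len(pts):
        d = math.dist(pts[i - 1], pts[i])
        if accum + d >= interval and d > 0:
            t = (interval - accum) / d
            q = (pts[i - 1][0] + t * (pts[i][0] - pts[i - 1][0]),
                 pts[i - 1][1] + t * (pts[i][1] - pts[i - 1][1]))
            out.append(q)
            pts.insert(i, q)  # continue measuring from the new point
            accum = 0.0
        else:
            accum += d
        i += 1
    while len(out) < n:       # guard against float rounding at the end
        out.append(pts[-1])
    return out[:n]

def normalize(pts, n=32):
    """Resample, scale to a unit box, and move the centroid to the origin."""
    pts = resample(pts, n)
    xs, ys = [p[0] for p in pts], [p[1] for p in pts]
    w = (max(xs) - min(xs)) or 1.0
    h = (max(ys) - min(ys)) or 1.0
    pts = [(x / w, y / h) for x, y in pts]
    cx = sum(p[0] for p in pts) / n
    cy = sum(p[1] for p in pts) / n
    return [(x - cx, y - cy) for x, y in pts]

def stroke_distance(a, b, n=32):
    """Average distance between corresponding points of two strokes."""
    a, b = normalize(a, n), normalize(b, n)
    return sum(math.dist(p, q) for p, q in zip(a, b)) / n
```

A recognizer module would simply keep a dictionary of template strokes and return the template with the smallest `stroke_distance` to the candidate.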
0402
Finally, I've implemented the calibration using homography matrices, cvSolve and cvFindHomography. At first I screwed up, but then I recalled that I had to normalize my matrices, since one of my assumptions is that I work with the z-component = 1.
To calibrate, all I need to do now is identify four predetermined non-collinear vertices by selecting their locations on the camera image. In the future, this method can be automated by projecting recognizable patterns for the camera to pick up.
Now I've got really accurate laser to screen pixel mapping. Yeah!
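For four exact point correspondences, what cvFindHomography solves boils down to an 8×8 linear system with h33 fixed at 1 — and applying the result requires dividing by the z-component, the same normalization mentioned above. A from-scratch Python sketch of that math, purely for illustration (Lightdraw itself goes through OpenCV's C API):

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting for an n x n system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]  # augmented matrix
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def find_homography(src, dst):
    """3x3 homography mapping four (x, y) points in src onto dst, h33 = 1."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        # u = (h11 x + h12 y + h13) / (h31 x + h32 y + 1), likewise for v
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = solve(A, b) + [1.0]
    return [h[0:3], h[3:6], h[6:9]]

def apply_homography(H, x, y):
    """Map a camera point to screen coordinates, normalizing so z = 1."""
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return ((H[0][0] * x + H[0][1] * y + H[0][2]) / w,
            (H[1][0] * x + H[1][1] * y + H[1][2]) / w)
```

Forgetting that final division by `w` is exactly the "screwed up" step: without it, points away from the calibration corners drift badly.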
0403
Following up from yesterday's work, I find that the camera does not have to be placed where the projector is. The DVCam should be able to see an area bigger than the projected screen if I position the camera to one side, such that it observes a trapezoidal projection of the screen. However, the downside is that the laser point loses intensity when viewed at an angle away from the screen's perpendicular. This can be resolved with a brighter laser, as tested with my 50 mW green laser.
0404
Very soon I'll have to develop a multiuser-capable application, since I am unable to find one that is both multiuser and aesthetically pleasing, yet serves a practical purpose.
In summary, I would have to deliver, sometime in May:
- A multiuser framework, such as OpenTouch (it is currently still under development). This is everything! I cannot use current operating systems' single-user interfaces, so I'll have to design my own. The design worries me a bit as it is quite challenging.
- A single application, probably like a splashtop, with many widgets inside. Each widget is controlled by a unique user, and multiple users can come together and control many widgets. These widgets will have to communicate with each other, of course.
- An aesthetically pleasing application with nice transition effects.
- A very responsive interface, for it to be usable.
Finally, I've submitted my interim report today.