Sunday, January 6, 2008

Week 00 - Because that's how programmers count.

0102

My First Day at Work

A Wednesday. I reported to work punctually despite the new year mood. (Hey, I'm sober.) I was introduced to Janice from HR, who helped me fill in a couple of lengthy documents (I've noticed that IHPC has more forms than usual, and they are particularly long). One document lists the schedule of shuttle bus services, and I bet it will come in handy. I was also given a student pass, which has yet to be activated.

My Worker's Quarters

"The Cove" is what we call Lightdraw's HQ. And The Cove was used for meeting in the morning, so I chilled in the Library (which looks more like an aquarium), deleting Laptop's bookmarks, history, stored passwords temp files, system restore files, cookies etc. as I was supposed to give it to IT department for a scan.


My First Assignment

I received my first assignment from Kevin, Lightdraw's research officer: the Lightdraw team was to dismantle the Christmas tree. We rewarded ourselves by eating the Christmas tree decorations (a choco gingerbread man and a white pastry-slash-biscuit Christmas-tree-shaped thingy). They looked somewhat edible. I haven't been at IHPC long enough to know whether they reuse their Christmas tree decorations or not. So Peng, my colleague on attachment from Temasek Polytechnic, bravely took the first bite. Monkey see, monkey do. The ornaments didn't taste too bad, actually.

Then, minutes later, Kevin opened his pack of choco-chip cookies, given by Lester, to share. D'oh! Kevin also bought us T-shirts from Mauritius. He flew back to Singapore only yesterday.


My Leaking Air-con


Peng and JL (surprisingly, he has the same surname as me), another intern from Temasek Poly, taught me how to fix the faulty air-con which, Peng says, "fails only after holidays (and holidays only), don't know why."

Here are the steps taught to me by Peng and JL:
  1. Take the makeshift funnel, made out of a cut mineral-water bottle, from the first drawer of the cabinet the air-con is sitting on.
  2. Get ready a pail, stunned (army slang) from the pantry.
  3. Open the trapdoor located at the side of the air-con, and drain the water into the pail.
  4. Close the trapdoor.
  5. Pour the water into the beans compartment of the coffee machine which Kevin drinks from every day.
  6. Pour the water away and return the pail to its rightful owner.

On to serious matters. Peng and JL gave a demo and showcased their progress. Kevin, having travelled back from Mauritius only yesterday, suffered slightly from jet lag, but that did not stop him from giving us a briefing to put the team on track. We (including Cheng Teng, aka CT) were given a brief update on Lightdraw's progress. Here are my current tasks, in general:
  • Laser pointer (depth) detection. (Read the paper provided by Kevin.)
  • Stretching: possibly detect two pointers and stretch a generic primitive.
  • Mainly HCI: look for ways to generate events to the OS.
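A minimal sketch of how the first two tasks might fit together, assuming nothing about Lightdraw's actual code: threshold an 8-bit grayscale frame to find the laser dot (a laser pointer tends to saturate the sensor, so a high threshold isolates it), then derive a stretch factor from the distance between two such dots. Both function names here are made up for illustration.

```cpp
#include <cmath>
#include <utility>
#include <vector>

// Returns the centroid of pixels >= threshold in a row-major 8-bit
// grayscale frame, or (-1,-1) if no pixel is bright enough.
std::pair<int, int> findLaserSpot(const std::vector<unsigned char>& img,
                                  int width, int height,
                                  unsigned char threshold) {
    long sumX = 0, sumY = 0, count = 0;
    for (int y = 0; y < height; ++y)
        for (int x = 0; x < width; ++x)
            if (img[(long)y * width + x] >= threshold) {
                sumX += x;
                sumY += y;
                ++count;
            }
    if (count == 0) return std::make_pair(-1, -1);
    return std::make_pair((int)(sumX / count), (int)(sumY / count));
}

// Stretch factor for a two-pointer gesture: the ratio of the current
// distance between the two spots to the distance when the gesture began.
double stretchFactor(std::pair<int, int> a, std::pair<int, int> b,
                     double initialDistance) {
    double dx = a.first - b.first;
    double dy = a.second - b.second;
    return std::sqrt(dx * dx + dy * dy) / initialDistance;
}
```

In practice the thresholding and centroid step would presumably be done with OpenCV on frames grabbed from the camera, but the arithmetic is the same.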

SAB's coming over, so Kevin did a few trial runs to prep himself for the demo.

Harold joined us for lunch. I spent the rest of the day setting up SuSE, Emacs, OpenCV, SVN and Lightdraw.


Note: The access point in the Cove is Light. The external IP of the SVN repository is noted in the bible.



0103

My Excursion

Today the Lightdraw team went to the Visualization Centre (VC) at Temasek Polytechnic. Kevin hooked us up with Mr Tan, who was kind enough to explain the technology behind each of the setups in the VC. The first setup is the most related to what we are designing. It projects an image onto a mirror, which reflects it onto a piece of vertical glass that has a "special refractive holographic film which scatters light" (Ref: TouchLight: An Imaging Touch Screen and Display for Gesture-Based Interaction). The image shows up more clearly on the viewer's side of the glass. Two infrared cameras pick up the infrared reflection off the user's hands, which enables multitouch interaction, like what you see in Minority Report. I've seen something similar before on YouTube; be sure to check out the series of videos by Johnny Chung Lee (HCII, Carnegie Mellon University) at http://www.cs.cmu.edu/~johnny/projects/wii/
The first video is analogous to the technology behind VC's first setup. We have a Wii back in the Cove, so Kevin suggested that we try out Lee's setup, as it looks pretty darn fun and interesting.

There is also a flat plasma display that supports stereoscopic 3D without headgear of any kind. The plasma is slightly fatter than usual, and the viewing angle for the 3D effect is very limited, possibly 20 degrees by my estimate. The viewer also has to stand approximately 1.5 metres away for the 3D effect to be noticeable; otherwise the image looks blurry, which might irritate the viewer.

The next setup achieves stereoscopy with polarising glasses and polarising filters on the projectors of three rear projections (two projectors for each of the three panels, each projector with a different polariser). On screen is a simulation of a virtual MRT station. It is similar to the setup seen at IHPC, except that it has a nifty handheld controller for X-Z plane panning, and headgear mounted with ultrasonic transmitters that communicate with 10 (?) strips of ultrasonic receivers mounted on the ceiling (my secondary school ultrasonic project paid off). The disadvantages are obvious: the headgear is heavy and messes up my 'do, fast movements cannot be detected, there is significant lag, prolonged use may cause dizziness, it requires a low-hanging ceiling for the ultrasonic receivers to pick up the signal accurately, the headgear comes with a heavy battery pack, and the setup is huge and requires lots of space.


The last setup is for teleconferencing. It has a normal piece of glass tilted at an angle to reflect images off an LCD towards the viewer; as such, the image can only be seen from the viewer's side and not from behind the glass. This is important, as there is a motorised camera behind the glass to capture the user through the see-through glass. Our first teleconference? With a cleaner at an office in Aeon. This setup is suited to a single user only, as there is just one keyboard and mouse. I would think the quality of the conference depends directly on how well the camera operates and on network latency. The camera looks very expensive, as it has motors, a wide-angle lens and the ability to zoom, all at the same time.

I heard the whole setup costs 4.2M. You have to be very credible to get that kind of funding.

Once again, thank you Mr Tan! I learnt a lot during this trip, and I got to experience advanced technology and equipment you don't see every day.

Wise man says: Five makes bad number for long journeys.

We took bus service 10 all the way back to the Capricorn. The miserable journey took an hour plus; I slept most of it away. Makes one wonder how Peng takes this bus service to work every day. I kowtow to you, man!


My OpenCV Installation on SuSe 10.3

Back at the Cove, I resumed setting up OpenCV. Finally got it up at the end of the day with CT's help.

Here's what needs to be done for SuSE 10.3.

Google "ffmpeg opencv". First search result:
http://www.comp.leeds.ac.uk/vision/opencv/install-lin-ffmpeg.html

Follow the instructions there with a few tweaks: instead of installing to /home/ffmpeg/ as the guide says, install to /usr/local/, and change the paths accordingly in the ./configure commands.

RTFM! Read all the INSTALL and README files included in the packages, and the ./configure output as well. Also make sure you have the devel packages, the image libraries and GTK+ 2.x, which does not come with SuSE by default. In YaST, GTK+ 2.x is called "gtk2", so search for "gtk2" instead.


This is my config output from OpenCV's ./configure, which you will run later on.
Now may be a good time to download the devel packages for everything marked "yes".

Windowing system --------------
Use Carbon / Mac OS X: no
Use gtk+ 2.x: yes
Use gthread: yes

Image I/O ---------------------
Use libjpeg: yes
Use zlib: yes
Use libpng: yes
Use libtiff: yes
Use libjasper: yes
Use libIlmImf: no

Video I/O ---------------------
Use QuickTime / Mac OS X: no
Use xine: no
Use ffmpeg: no
Use dc1394 & raw1394: yes
Use v4l: yes
Use v4l2: yes



Also, only the "make install" commands require root privileges. You do not need root access for ./configure, despite what the guide says.

CT pointed out an essential fix to ffmpeg's avcodec.h (thanks, CT!). Patch avcodec.h in /usr/local/include/libavcodec and any other places it may be found or used:
Replace:
#define AV_NOPTS_VALUE INT64_C(0x8000000000000000)
with
#define AV_NOPTS_VALUE 0x8000000000000000LL

The line is found near the top of the file.


After which, OpenCV may still refuse to compile with ffmpeg. It didn't compile for me. The solution? Dr Voicu from CS3212 would say, "accept it!" That is one of the more common solutions to problems in CS. So, take ffmpeg out of the OpenCV ./configure; mine looked something like this:

./configure --enable-apps --enable-shared --without-ffmpeg --with-gnu-ld --with-x --without-quicktime CXXFLAGS=-fno-strict-aliasing
That should be all there is to compiling OpenCV. Next: getting my ancient Logitech QuickCam IM to work. Perhaps I should invest in a Logitech QuickCam Sphere? Sponsors? :)



0104

My Webcam

I managed to set up my webcam the night before. I found good sources for Logitech Linux drivers:
http://qce-ga.sourceforge.net/
http://www.quickcamteam.net/hcl/linux/logitech-webcams
(go here for Quickcam IM)

Note: Quickcam IM uses spca drivers, not uvcvideo.

For the QuickCam drivers, you will also need xawtv, v4l, SDL, etc.
I got SDL off the SuSE repository through YaST; I wasn't able to compile either the downloaded RPM or the tarball.

spcaview works like magic.

I brought my camera to the office and managed to test it with the samples included in OpenCV, with Kevin's help. Change the permissions on opencv/samples/build_all.sh to executable, then run it to compile the samples.

"motempl" is a good sample to use to test your webcam. "edge" too.

Next, I managed to compile the Lightdraw code.

For some reason I've yet to uncover, only "alpha", "edgeBlending" and "trail_v2" are responsive enough. Refer to the bible for the complete list. I think I'll debug this over the weekend. Also, the code is only half written in C++, so I will rewrite a version in C++ anyway.
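One cheap way to start chasing the unresponsive demos is to time each frame's processing, to tell CPU-bound code apart from time spent waiting on the camera or X server. A sketch, where exampleWorkload() is a hypothetical stand-in for a demo's per-frame work:

```cpp
#include <ctime>

// Hypothetical stand-in for one frame's worth of a demo's processing.
void exampleWorkload() {
    volatile long sum = 0;
    for (long i = 0; i < 100000; ++i) sum += i;
}

// Returns the CPU seconds consumed by one call to frame(). If this number
// is small but the demo still lags, the bottleneck is likely I/O or
// display, not the processing itself.
double timeFrameSeconds(void (*frame)()) {
    std::clock_t start = std::clock();
    frame();
    return (double)(std::clock() - start) / CLOCKS_PER_SEC;
}
```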

Todo:
  • Find out why some programs are unresponsive on my laptop, probably using my desktop as a control.
  • Find out why I cannot connect to the external IP of the Lightdraw repository.
  • Read the paper.
  • Read the OpenCV wiki and manuals.
    http://www.cs.iit.edu/~agam/cs512/lect-notes/opencv-intro/opencv-intro.html looks good.
  • Get laser point detection up.
  • Try to borrow a SATA/external DVD reader from colleagues.
  • Try out Johnny Lee's Wii code.
  • Contact my supervisor from SoC, Dr TEO Yong Meng.
