Tuesday, March 31, 2009, 10:55 PM - code
Here's a little standalone utility I put together for a couple of students while teaching at the Medialab Prado. I'm posting it here in case someone else finds it useful.
CamshiftOSC adds network functionality to OpenCV's Camshift demo. It lets you interactively select a region of interest within a live video stream and send the center and relative angle of that region to OSC clients (Pure Data, Blender, Processing, etc.) quickly and simply. In the workshop it was used for tracking the tops of people's heads, but any distinct clump of pixels (an LED, a flame, a cat) will work.
Start it like so:
./camshiftOSC <camera index> <IP> <port>
So, if you wanted to capture from /dev/video1 and send the center of a tracked area to port 4950 on a computer at 193.2.132.73 on the internet, you'd run:
./camshiftOSC 1 193.2.132.73 4950
Use 127.0.0.1 if you want to send to a client on the same host.
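If you'd rather pick the values up in Python than in Pd or Processing, a minimal receiver might look something like the sketch below. It uses the python-osc package; the '/camshift' path and the (x, y, angle) argument layout are just illustrative, so check camshiftOSC.c for the real address and arguments.
# Minimal OSC receiver sketch using the python-osc package (pip install python-osc).
# The "/camshift" path and (x, y, angle) layout are assumptions for illustration;
# see camshiftOSC.c for the actual address and arguments it sends.
from pythonosc import dispatcher, osc_server

def on_track(address, x, y, angle):
    # x, y: centre of the tracked region; angle: its relative rotation
    print("centre=(%.1f, %.1f) angle=%.1f" % (x, y, angle))

d = dispatcher.Dispatcher()
d.map("/camshift", on_track)

# listen on the port you passed to camshiftOSC (4950 in the example above)
server = osc_server.ThreadingOSCUDPServer(("0.0.0.0", 4950), d)
server.serve_forever()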
It should compile on any Linux system with liblo and OpenCV installed.
To get up and running on Ubuntu or Debian systems:
sudo apt-get install libcv-dev liblo-dev
Compile it like so:
gcc camshiftOSC.c -o camshiftOSC -I/usr/include/opencv -I/usr/include/lo -lcv -lcvaux -lhighgui -llo
Be sure to play with the sliders ('VMax' especially) to get the best results.
Cheers
Saturday, March 21, 2009, 11:46 PM - code
I notice that lots of people are looking for a simple and portable example of how to capture and display video from a webcam. I've written one up in around 70 lines of C++ using OpenCV and posted it here in case it's useful. Because it uses OpenCV you can also use it as a capture skeleton for a computer vision application.
I've tested it on a GNU/Linux system but it should compile fine on OS X. See the comments for how to compile and use it.
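If you just want the gist before downloading, the loop boils down to something like the sketch below, written here with the newer Python cv2 bindings rather than the C++ of the linked example:
# The gist of the capture-and-display loop, sketched with the Python cv2 bindings
# (not the linked C++ example): open the first camera, grab frames, show them,
# quit on 'q'.
import cv2

cap = cv2.VideoCapture(0)          # first available camera
if not cap.isOpened():
    raise SystemExit("couldn't open the camera")

while True:
    ok, frame = cap.read()         # grab and decode one frame
    if not ok:
        break
    cv2.imshow("capture", frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
cv2.destroyAllWindows()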
Hope that helps!
Sunday, December 16, 2007, 03:31 PM - code
I recently spent some time looking around the hinternets for a simple method to stream live video, captured using OpenCV from a webcam or firewire camera, to textures on one or more OpenGL polygons, windowed with something light like GLUT. Having found nothing that achieves this, and seeing that lots of people were trying, I wrote a program in C that does.
Why OpenCV? OpenCV offers advanced texture processing and analysis: being able to find natural features in images on OpenGL surfaces opens up many interesting possibilities.
The trick was just to pass correctly scaled (power-of-2) captured IplImage data to glTexSubImage2D every frame, with the texture correctly formatted and bound beforehand.
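In outline it looks like the sketch below, written here with the Python bindings (cv2, PyOpenGL, GLUT) rather than the C of the download; the 512x512 texture size and RGB format are illustrative choices, not lifted from the source.
# Sketch of the idea with cv2 + PyOpenGL + GLUT (not the C source from the download).
# Texture size and format here are illustrative assumptions.
import cv2
from OpenGL.GL import *
from OpenGL.GLUT import *

TEX_W, TEX_H = 512, 512            # power-of-2 texture, as described above
cap = cv2.VideoCapture(0)          # first available camera
tex_id = None

def init_texture():
    global tex_id
    tex_id = glGenTextures(1)
    glBindTexture(GL_TEXTURE_2D, tex_id)
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR)
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR)
    # allocate the texture once; frames are streamed into it with glTexSubImage2D
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, TEX_W, TEX_H, 0, GL_RGB, GL_UNSIGNED_BYTE, None)

def display():
    ok, frame = cap.read()
    if ok:
        frame = cv2.resize(frame, (TEX_W, TEX_H))        # scale to the power-of-2 size
        frame = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)   # match the texture's format
        glBindTexture(GL_TEXTURE_2D, tex_id)
        glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, TEX_W, TEX_H, GL_RGB, GL_UNSIGNED_BYTE, frame)
    glClear(GL_COLOR_BUFFER_BIT)
    glEnable(GL_TEXTURE_2D)
    glBegin(GL_QUADS)                                    # one textured polygon
    glTexCoord2f(0, 0); glVertex2f(-1,  1)
    glTexCoord2f(1, 0); glVertex2f( 1,  1)
    glTexCoord2f(1, 1); glVertex2f( 1, -1)
    glTexCoord2f(0, 1); glVertex2f(-1, -1)
    glEnd()
    glutSwapBuffers()

glutInit()
glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB)
glutCreateWindow(b"capture")
init_texture()
glutDisplayFunc(display)
glutIdleFunc(glutPostRedisplay)    # redraw as fast as frames arrive
glutMainLoop()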
Get the source code here, licensed under the GPLv3. It will compile on a Linux system; OpenCV, FreeGLUT and OpenGL are needed, and you'll need hardware-accelerated 3D too.
Enjoy!
Tuesday, July 31, 2007, 11:37 AM - code
Dilemma: Hotel in foreign country and must wake up very early. Phone critically low on battery, charger missing, hotelier appears to be asleep and no alarm clock in sight. Very tired, reasonably inebriated.
Fix: Write a script that emulates the sound of my phone's alarm before passing out:
# simple alarm script.
# requires the program 'beep'
# turn up your PC speaker volume and use as follows:
# 'python alarm.py HH:MM'
import time
import sys
import os
wakeHour, wakeMinute = [int(t) for t in sys.argv[1].split(':')]

while 1:
    time.sleep(1)
    now = time.localtime()
    # compare (hour, minute) as a pair, so e.g. 08:10 still triggers a 07:45 alarm
    if (now[3], now[4]) >= (wakeHour, wakeMinute):
        os.popen('beep -l 40 -f 2000 -n -l 40 -f 10000 -n -l 40 -f 2000')
Saturday, March 3, 2007, 09:30 PM - code
While here at Georgia Tech I'm giving a class on the development of 'expressive games', and for it I chose Nintendo Wiimotes as the control context for the class designs. The final projects will be produced in Blender, using the Blender game engine.
With only Windows machines at my disposal, I wrote a basic Python script that exposes accelerometer, tilt and button events from GlovePIE over Open Sound Control (which that application supports natively) to Blender's realtime engine. I went this way rather than creating a Bluetooth interface inside Blender for two reasons: GlovePIE is a great environment for building useful control models from raw input and speaks the network-capable OSC protocol, and I wanted to keep input-server-like code out of Blender (for reasons you'd understand if you've used Python in Blender).
GlovePIE is more than I need, however, and being on Linux I don't have a Windows machine near me most of the time. I looked into various options for getting control data from a few different 'drivers' out over OSC and into Blender. Preferring to work in Python, I tried WMD but found it too awkward to develop with, although it is nothing short of comprehensive. I finally settled on the very neatly written (Linux-only) libwiimote and wrote a simple little application in C to provide what I need. Here it is: wiiOSC.
To run it on your system you'll need libwiimote, Steve Harris's lightweight OSC implementation liblo, a Bluetooth dongle (of course) and Bluetooth support in your kernel (most modern distros support popular dongles out of the box). wiiOSC will send everything libwiimote supports (IR, accelerometer, tilt, button events etc.) to any computer you specify, whether 127.0.0.1 or a machine on the internet.
wiiOSC is invoked as follows:
wiiOSC MAC remote_host port
For instance, to send wiimote data to a machine with the IP 192.168.1.102 on port 4950, I'd run:
wiiOSC 00:19:1D:2C:31:E1 192.168.1.102 4950
To get the MAC address of your wiimote, just use:
hcitool scan
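Before wiring things into your application of choice it's handy to confirm data is actually arriving; a catch-all dump like the sketch below will do. It uses the python-osc package, and the paths you'll see depend on what libwiimote actually exposes.
# Catch-all OSC dump, sketched with the python-osc package (pip install python-osc).
# Prints every message wiiOSC sends; the actual paths depend on libwiimote.
from pythonosc import dispatcher, osc_server

def dump(address, *args):
    print(address, args)

d = dispatcher.Dispatcher()
d.set_default_handler(dump)       # called for any path without its own handler

# listen on the port you passed to wiiOSC (4950 in the example above)
osc_server.BlockingOSCUDPServer(("0.0.0.0", 4950), d).serve_forever()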
I use Blender as my listener context but you can of course pick up the wiimote data in any application that supports it: Pure Data, Veejay, etc. To use Blender as your listener you'll need Wiretap's Python OSC implementation and this Blender file. Enjoy.
Tuesday, February 6, 2007, 02:00 AM - code
I've picked out the packet capture part of PG and turned it into a reasonably useful and lightweight logger that should run on any UNIX system (tested on Linux). Packet length, remote IP, transaction direction, country code and port are all logged. Packet lengths are summed over time, so you see an accumulation of traffic per IP.
Use (as root):
./pcap_collate <DEVICE> <PATH>
This script will capture, log and collate TCP and UDP packets going over <DEVICE> (eth0, eth1, etc.). The <PATH> argument sets the location the resulting gzipped log will be written to; the log is updated every 1000 packets.
The script will automatically start a new log on a new day, and because the log is written out every 1000 packets it can be restarted at any time without losing more than 1000 packets of traffic.
The log is a dump of the dict containing comma separated fields structured as follows:
IP, direction, port, geo, length
It will filter out all the packets on the local network, and so is intended for use in recording Internet traffic going over a single host.
Ports to be filtered for can be set in the file config/filter.config
Stop capture with the script 'stop_capture'.
Get it here. Unpack and see the file README.txt.
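As a quick example of doing something with the output, here's a sketch that totals traffic per remote IP. It assumes one comma-separated record per line in the field order above, and the log filename is made up; adjust both to match what pcap_collate actually writes.
# Sketch: total bytes per remote IP from one gzipped log, assuming one
# comma-separated record per line (IP, direction, port, geo, length).
# The filename is made up.
import gzip

totals = {}
with gzip.open('traffic.log.gz', 'rt') as log:
    for line in log:
        fields = [f.strip() for f in line.split(',')]
        if len(fields) != 5:
            continue                                # skip anything that isn't a record
        ip, direction, port, geo, length = fields
        totals[ip] = totals.get(ip, 0) + int(length)

# busiest remote hosts first
for ip, total in sorted(totals.items(), key=lambda kv: kv[1], reverse=True):
    print(ip, total)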
Thursday, February 2, 2006, 10:11 PM - code
Some fun to be had. To start with try this:
import ossaudiodev
from ossaudiodev import AFMT_S16_LE
from Numeric import *
dsp = ossaudiodev.open('w')
dsp.setfmt(AFMT_S16_LE)
dsp.channels(2)
dsp.speed(22050)
i = 0
x = raw_input("length: ")
x = int(x)
a = arange(x)
while 1:
    # between 200 and 600 is good
    while i < x:
        i = i + 1
        b = a[i:]
        dsp.writeall(str(b))    # the array's text repr, written out as raw samples
        print i, ":", x
    else:
        # ramp back down to zero, then start over
        while i != 0:
            i = i - 1
            b = a[i:]
            dsp.writeall(str(b))
            print i, ":", x
Thursday, February 2, 2006, 10:10 PM - code
It's harder than you think!
Here it is in Python, ported from some Java I found online.
Thursday, February 2, 2006, 10:10 PM - code
Here's a wifi access point browser I wrote in Python for Linux users who prefer console applications. I'll get around to one that roams and pumps based on the best offer.
scent.py
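The core of it is just scan-and-parse; roughly something like the sketch below, which shells out to iwlist. The wlan0 interface name is an assumption, and scent.py itself does rather more than this.
# Rough sketch of the scan-and-parse core: shell out to iwlist and print the
# ESSIDs it reports. 'wlan0' is an assumption; scent.py does more than this.
import subprocess

out = subprocess.run(["iwlist", "wlan0", "scan"],
                     capture_output=True, text=True).stdout
for line in out.splitlines():
    line = line.strip()
    if line.startswith("ESSID:"):
        print(line.split(":", 1)[1].strip('"'))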
Thursday, February 2, 2006, 10:09 PM - code
Here's a little Python script to let you know when your laptop battery is running low.
dacpi.py
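The idea is just to poll the battery level and nag; roughly like the sketch below, though the /sys path and the threshold here are assumptions rather than what dacpi.py actually does.
# Rough sketch of the idea: poll the battery level and complain when it drops
# below a threshold. The /sys path and threshold are assumptions; dacpi.py is
# the real thing.
import time

THRESHOLD = 10          # warn below 10%
BATTERY = "/sys/class/power_supply/BAT0/capacity"

while True:
    with open(BATTERY) as f:
        level = int(f.read().strip())
    if level < THRESHOLD:
        print("battery low: %d%% left" % level)
    time.sleep(60)      # check once a minute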




