tag:blogger.com,1999:blog-17854737968651193432024-03-16T00:08:19.785-07:00KeyboardmodsInput devices and custom keyboardstrtghttp://www.blogger.com/profile/11883202530307275379noreply@blogger.comBlogger31125tag:blogger.com,1999:blog-1785473796865119343.post-36011588069422977882013-03-27T22:44:00.001-07:002013-03-28T01:14:21.778-07:00Leap Motion in Gentoo LinuxGreat news for those of you with Leap Motion developer kits: today's 0.7.6 release of the SDK adds Linux support. Their package targets Ubuntu specifically, but I was able to get it working on a Gentoo system as well. Here's a preliminary write-up of what I did:
<ol>
<li>Download the 0.7.6 SDK (if you're a Leap developer you know how to get it)</li>
<li>move the developer kit tarball to a convenient temp directory</li>
<li>tar xvzf Leap_Developer_Kit_0.7.6_3012_Linux.tar.gz</li>
This will extract the Debian packages; for Gentoo we can convert them to a tarball in the next step using alien.
<li>alien -t Leap-0.7.6-x64.deb</li>
Note: if you're on a 32-bit system, use Leap-0.7.6-x86.deb
<li> tar xvzf leap-0.7.6.tgz </li>
<li> the previous step extracted usr and etc directories so now we copy the files from these directories into place:<br>
<code>sudo cp -ir usr/local/* /usr/local/</code><br>
Note that I use the -i flag to avoid accidentally overwriting existing files. I wound up copying all the files into place rather than just the Leap-specific ones, because the Leap installer bundles its own versions of the Qt libraries and trying to use my system Qt libraries was causing a segfault.
</li>
<li><code>sudo cp -ir etc/udev/rules.d/* /etc/udev/rules.d/</code>
<br>
Copying those udev rules should in theory allow you to run the Leap application without superuser privileges, but that did not work for me; I will try to figure this out later.
</li>
I'm not sure if the next three steps are necessary on Gentoo, but just in case: they create the plugdev group if it does not exist, add you to that group, and then reload your session so your group membership is updated without having to log out and back in.
<li><code>sudo groupadd plugdev</code></li>
<li><code>sudo usermod -a -G plugdev $USER</code></li>
<li><code>exec su -l $USER</code> #refresh group membership </li>
<li><code>sudo Leap</code></li>
<br>
Assuming that /usr/local/bin/ is in your path, you should just be able to run the Leap command in your shell. You need the Leap application running in the background for any other interactions with the Leap device to work.
Notice that I had to run the command as root even though I copied in the udev rules. In my case, running LeapPipeline (ostensibly the command-line alternative to Leap) without root privileges gives an error like:
<code>[00:54:08] [Critical] [EC_NO_DEVICE] Please make sure your Leap device is plugged in.</code>
<br> Running LeapPipeline <b>with</b> root privileges does not give me any error message, just some debug output and then it quits without doing anything.
Running Leap without root privileges in my case will not give any error messages; everything will simply fail silently, i.e. if you run the visualizer you will get no tracking data.
<li>
<code>Visualizer</code>
<br>
Test the cool new gesture features by running the Visualizer command and pressing the letter 'o' to enable drawing gestures. In this screenshot I have traced out a counter-clockwise circle with my finger, which is one of the supported gestures, so it is drawn as a light blue circular arrow. You can do multi-finger swipes, taps, and circles. See this part of the documentation for more details: <a href="https://developer.leapmotion.com/documentation/guide/Leap_Overview#taps">https://developer.leapmotion.com/documentation/guide/Leap_Overview#taps</a>
<a href="http://1.bp.blogspot.com/-yCLTzpDwrF4/UVP6uGoNsnI/AAAAAAAAATk/Qwdter-5ygc/s1600/2013-03-28-010907_1280x800_scrot.png" imageanchor="1" ><img border="0" src="http://1.bp.blogspot.com/-yCLTzpDwrF4/UVP6uGoNsnI/AAAAAAAAATk/Qwdter-5ygc/s320/2013-03-28-010907_1280x800_scrot.png" /></a>
</li>
</ol>
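If the bundled udev rules don't take effect, it may help to write a minimal rule of your own. The fragment below is only a sketch of what such a rule might look like, using standard udev syntax; it is not the contents of Leap's own rules file, and you should substitute the vendor/product IDs that lsusb reports for your device:

```
# /etc/udev/rules.d/99-leap.rules (hypothetical example)
# Replace xxxx/yyyy with the idVendor/idProduct shown by `lsusb` for the Leap.
SUBSYSTEM=="usb", ATTRS{idVendor}=="xxxx", ATTRS{idProduct}=="yyyy", GROUP="plugdev", MODE="0660"
```

After adding the rule, run <code>sudo udevadm control --reload-rules</code> and replug the device.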
<h3>Play with the samples</h3>
At the top level of the directory where you unpacked Leap_Developer_Kit_0.7.6_3012_Linux.tar.gz you will see an examples directory. Here is what they look like.
<h4>FingerVisualizer</h4>
<a href="http://3.bp.blogspot.com/-42bVaD2RLm0/UVPV3jUYonI/AAAAAAAAATE/tlNNDuAxWsg/s1600/2013-03-27-223053_1026x796_scrot.png" imageanchor="1" ><img border="0" src="http://3.bp.blogspot.com/-42bVaD2RLm0/UVPV3jUYonI/AAAAAAAAATE/tlNNDuAxWsg/s320/2013-03-27-223053_1026x796_scrot.png" /></a>
<h4>MotionVisualizer</h4>
<a href="http://4.bp.blogspot.com/-ljt6jjeXAyw/UVPXQ7tapHI/AAAAAAAAATM/uxmtFE4zg9Y/s1600/2013-03-27-223404_1026x796_scrot.png" imageanchor="1" ><img border="0" src="http://4.bp.blogspot.com/-ljt6jjeXAyw/UVPXQ7tapHI/AAAAAAAAATM/uxmtFE4zg9Y/s320/2013-03-27-223404_1026x796_scrot.png" /></a>
<h4>UnitySandbox</h4>
<a href="http://4.bp.blogspot.com/-utQ5URCmnRg/UVPXZ5LJh0I/AAAAAAAAATU/gAxBGftwMqk/s1600/2013-03-27-223637_1024x768_scrot.png" imageanchor="1" ><img border="0" src="http://4.bp.blogspot.com/-utQ5URCmnRg/UVPXZ5LJh0I/AAAAAAAAATU/gAxBGftwMqk/s320/2013-03-27-223637_1024x768_scrot.png" /></a>
<h3>Some caveats</h3>
At some point a .Leap Motion directory needs to be created in your home directory (note the space between the two words and the leading "."). I'm not sure if this is done by the installer or by the first run of the Leap application. In my case I actually copied the contents of the .Leap Motion folder on my Ubuntu system to my Gentoo system. In particular, there should be two files in this folder: a leapid file and a file named something like DEV-12345 (yours will be different). The Leap application is actually a GUI app which resides in your dock/panel, if you use one, but that's not necessary; Leap will run without a panel. I've confirmed that it works with xfce4-panel if you want to see it, but you can access almost all the executables directly from the command line anyway:
<ul>
<li>/usr/local/bin/Visualizer</li>
<li>/usr/local/bin/ScreenLocator</li>
<li>/usr/local/bin/Recalibrate</li>
<li>/usr/local/bin/LeapPipeline</li>
<li>/usr/local/bin/Leap</li>
</ul>
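As a quick sanity check for the caveat above, a few lines of Python can verify that the configuration directory looks right. This is just a convenience sketch based on the description above; the helper name is mine and not part of the Leap SDK:

```python
import os
import glob

def check_leap_config(home):
    """Return a list of problems with the '.Leap Motion' config directory."""
    problems = []
    cfg = os.path.join(home, ".Leap Motion")  # note the space and leading dot
    if not os.path.isdir(cfg):
        return ["missing directory: %s" % cfg]
    if not os.path.isfile(os.path.join(cfg, "leapid")):
        problems.append("missing leapid file")
    if not glob.glob(os.path.join(cfg, "DEV-*")):
        problems.append("missing DEV-* calibration file")
    return problems

if __name__ == "__main__":
    for p in check_leap_config(os.path.expanduser("~")) or ["looks ok"]:
        print(p)
```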
Post questions if you have any problems.trtghttp://www.blogger.com/profile/11883202530307275379noreply@blogger.com22tag:blogger.com,1999:blog-1785473796865119343.post-36831318794686144722013-01-04T04:16:00.000-08:002013-01-04T04:16:35.362-08:00TI MSP430 Launchpad in gentoo linuxGetting started with the MSP430 Launchpad in Gentoo is pretty straightforward, once you've dug through some documentation. Here are some instructions that should get you up and running.
<ol>
<li><code>emerge crossdev</code></li><br>
<b>NOTE:</b> <u>Only perform steps 2 and 3 if you do not already have an overlay setup</u><br>
<br>
<li><code>mkdir /usr/local/portage</code></li><br>
<li>add <code>PORTDIR_OVERLAY="/usr/local/portage"</code> to /etc/make.conf</li><br>
<b>NOTE:</b> <u>Only perform steps 4-6 if /etc/portage/package.env is a file rather than a directory on your system.</u> Also note that the string "x86_64-pc-linux-gnu" needs to match the architecture of your system, which you can find by running <code>gcc -v</code> and checking the string labeled "Target".
<br><br>
<li><code>mv /etc/portage/package.env /etc/portage/package.env.bak</code></li><br>
<li><code>mkdir /etc/portage/package.env</code></li><br>
<li><code>mv /etc/portage/package.env.bak /etc/portage/package.env/x86_64-pc-linux-gnu</code></li><br>
<li><code>crossdev -s4 -t msp430 </code></li><br>
-s4 means stage 4, which will build a full gcc, libc, kernel headers, and binutils.
This command will also create /usr/msp430/etc/portage/make.conf which I believe will control how any future msp430 specific packages get emerged.<br><br>
<li><code>crossdev --ex-only --ex-gdb -t msp430 </code></li><br>
Now that you have gcc going, this command will build just gdb without redoing all the other stages (libc, gcc, etc.).<br><br>
<li><code>emerge mspdebug</code></li><br>
<li><code>msp430-gcc -mmcu=msp430g2553 blink.c</code></li><br>
<b>Scroll down to the end of this post to find the source code for blink.c</b>
Note that my Launchpad came prepopulated with an msp430g2553 in the DIP socket; you will want to check what yours came with and set the -mmcu parameter accordingly. The model number is printed on the top of the microcontroller. <br><br>
<li><code>sudo mspdebug rf2500</code></li><br>
This command will start the debugger which will spew out a bunch of output before displaying a prompt that looks like this:<br>
<code>
(mspdebug)<br><br>
</code>
<li><code>(mspdebug) <b>prog a.out</b></code></li>
Note that you only type the text in bold in the command above, (mspdebug) is just the prompt that is displayed on screen.
<br>
<li>hit Ctrl+D to quit the debugger</li>
The microcontroller will start executing code, in this case blinking LED1. (LED2, for reference, is attached to P1.6.)
</ol>
<script src="https://gist.github.com/4451982.js"></script>trtghttp://www.blogger.com/profile/11883202530307275379noreply@blogger.com22tag:blogger.com,1999:blog-1785473796865119343.post-59557905306691862192012-12-30T23:44:00.000-08:002012-12-31T01:35:56.739-08:00Wiimote+nunchuk web surfing in linuxAt this point, messing around with the wiimote probably seems dated, but hey, I have one lying around with a MotionPlus and nunchuk. If you follow this post, you'll end up able to control your mouse cursor with the nunchuk analog stick, scroll up/down/left/right with the wiimote d-pad, and left/right click with either the wiimote or the nunchuk. Let's get started.
<br><br>
Prerequisites:
<ul>
<li>A wiimote</li>
<li>A wiimote nunchuk</li>
<li>A linux system with the uinput kernel module</li>
</ul>
Procedure:
<ol>
<li>Clone my fork of cwiid into a directory of your choice (wiimouse in this example)<br>
<code>git clone https://github.com/trtg/cwiid.git wiimouse</code>
</li>
<li>
<code>cd wiimouse</code>
</li>
<li><code>aclocal</code></li>
<li><code>autoconf</code></li>
<li><code>./configure --with-cwiid-config-dir=/etc/cwiid/</code></li>
<li><code>make</code></li>
<li><code>sudo make install</code><br><br>
Note that by default this will install everything using a prefix of /usr/local unless you pass --prefix=some_other_directory to configure, so make sure whatever prefix you use is in your path.
</li>
<li><code>sudo modprobe uinput</code></li>
<li>Create a config file for the plugin I wrote at /etc/cwiid/nunchuk_stick_ptr, containing the text below:<br>
<code>
include buttons<br>
Plugin.nunchuk_stick_ptr.X = REL_X<br>
Plugin.nunchuk_stick_ptr.Y = REL_Y<br>
</code>
</li>
<li><code>sudo wminput -c nunchuk_stick_ptr</code>
You will see this message appear:
<code>Put Wiimote in discoverable mode now (press 1+2)...</code>
Do as it says: press buttons 1 and 2 on the wiimote, then release them. The LEDs on the wiimote will blink for a while, then the terminal where you entered the previous command will say:
<code>Ready.</code>
</li>
<li>
Now try moving the nunchuk analog stick and it should move your cursor around.
Adjust /etc/cwiid/wminput/buttons if you want to change button mappings
</li>
</ol>
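The plugin itself boils down to mapping the analog stick (which reports roughly 0-255 per axis, centered near 128) to relative pointer motion, which is what the REL_X/REL_Y lines in the config express. The real plugin is C; the idea can be sketched in a few lines of Python, where the deadzone and scale values are illustrative choices of mine, not the plugin's exact constants:

```python
def stick_to_rel(raw, center=128, deadzone=10, scale=0.2):
    """Map one raw analog-stick axis reading to a relative pointer delta.

    Readings inside the deadzone produce no motion, so a slightly
    off-center stick doesn't make the cursor drift.
    """
    offset = raw - center
    if abs(offset) <= deadzone:
        return 0
    return int(offset * scale)

# Each nunchuk sample yields one REL_X and one REL_Y event:
dx = stick_to_rel(200)   # stick pushed right -> positive delta
dy = stick_to_rel(128)   # stick centered     -> no vertical motion
```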
<b>Note:</b> if you do not want to have to run wminput as root, you can configure things so that uinput may be accessed by users in the wheel group. To make this happen, create a udev rules file in /etc/udev/rules.d/ with a name of your choosing, for example /etc/udev/rules.d/uinput.rules
which contains the following line of text:<br><br>
<code>KERNEL=="uinput",GROUP="wheel",MODE="0660"</code><br><br>
If you've already loaded the uinput kernel module prior to creating this file, you'll have to unload the module:<br>
<code>sudo rmmod uinput</code><br>
and then reload it:<br>
<code>sudo modprobe uinput</code><br>
Now you should be able to run wminput without sudo (assuming your user is in the wheel group)
trtghttp://www.blogger.com/profile/11883202530307275379noreply@blogger.com10tag:blogger.com,1999:blog-1785473796865119343.post-65423252309084561462012-06-15T18:58:00.000-07:002012-06-15T19:06:11.453-07:00Fat Secret calorie counter access with pythonThis post complements my earlier discussion of the Withings scale API here:
<br>
<a href="http://www.keyboardmods.com/2012/05/withings-wifi-scale-python-dashboard.html"> http://www.keyboardmods.com/2012/05/withings-wifi-scale-python-dashboard.html</a>
<br>
I've written a convenience library in python that simplifies the process of retrieving protein, carbohydrate, fat, and calorie data from Fatsecret along with several other features from the REST API. To make use of the library you'll need to register as a developer at the link below:
<br/>
<a href="http://platform.fatsecret.com/api/Default.aspx?screen=r">http://platform.fatsecret.com/api/Default.aspx?screen=r</a>
<div class="separator" style="clear: both; text-align: center;">
<a href="http://4.bp.blogspot.com/-_DQNRh7npZY/T7nBoUKnn4I/AAAAAAAAAI0/z7c_R6w73hA/s1600/fatsecret_registration.png" imageanchor="1" style="margin-left:1em; margin-right:1em"><img border="0" height="368" width="400" src="http://4.bp.blogspot.com/-_DQNRh7npZY/T7nBoUKnn4I/AAAAAAAAAI0/z7c_R6w73hA/s400/fatsecret_registration.png" /></a></div>
After registering you will be given a consumer/API key and secret which you will need to use my python library. Once you have your key and secret, download the python library with the following command:
<br><br>
<code> git clone https://github.com/trtg/pyfatsecret.git</code>
<br><br>
The library comes with the example below that demonstrates its usage.
To use the example, replace the strings 'your_key_goes_here' and 'your_secret_goes_here' with the key and secret you were assigned after registering as a developer.
<br>
<script src="http://gist-it.appspot.com/github/trtg/pyfatsecret/raw/master/example.py"></script>
Note that the example above makes use of pandas to plot timeseries. You can get the latest pandas with this command:
<br><br>
<code>git clone https://github.com/pydata/pandas.git</code>
<br><br>
I've tested the example above with the stable 0.7.3 release as well.
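If you'd rather see the bookkeeping without pandas, the macronutrient arithmetic behind such a plot is simple: sum each day's entries, and cross-check calories from the gram totals using the standard Atwater factors (4 kcal/g for protein and carbohydrate, 9 kcal/g for fat). A stdlib-only sketch; the function names and sample data are mine, not part of pyfatsecret:

```python
def calories_from_macros(protein_g, carb_g, fat_g):
    """Estimate calories using the standard Atwater factors:
    4 kcal/g for protein and carbohydrate, 9 kcal/g for fat."""
    return 4 * protein_g + 4 * carb_g + 9 * fat_g

def daily_totals(entries):
    """Sum (date, protein_g, carb_g, fat_g) food-diary entries per day."""
    totals = {}
    for date, protein, carb, fat in entries:
        p, c, f = totals.get(date, (0, 0, 0))
        totals[date] = (p + protein, c + carb, f + fat)
    return totals

entries = [
    ("2012-06-14", 30, 50, 10),
    ("2012-06-14", 25, 40, 15),
    ("2012-06-15", 40, 60, 20),
]
totals = daily_totals(entries)
# calories_from_macros(*totals["2012-06-14"]) -> 805
```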
<br>
Once you paste your key and secret into place and install pandas you can run the example as follows:
<br><br>
<code>python example.py</code>
<br><br>
Assuming you already have some data recorded with fatsecret, you should get a plot like the one below after running (click on the image to see a larger copy):
<br>
<a href="http://4.bp.blogspot.com/-nn6XWqG8HLk/T9vnmU2Nf7I/AAAAAAAAANg/WvPmZ9BV-b8/s1600/fatsecret_figure.png" imageanchor="1" style="margin-left:1em; margin-right:1em"><img border="0" height="318" width="400" src="http://4.bp.blogspot.com/-nn6XWqG8HLk/T9vnmU2Nf7I/AAAAAAAAANg/WvPmZ9BV-b8/s400/fatsecret_figure.png" /></a>trtghttp://www.blogger.com/profile/11883202530307275379noreply@blogger.com7tag:blogger.com,1999:blog-1785473796865119343.post-87014866464938480342012-05-21T00:28:00.000-07:002012-06-15T18:23:46.368-07:00Withings wifi scale python dashboard using matplotlib and pandasThe Withings wifi scale is a great bit of hardware, much more visually appealing than <a href="http://www.keyboardmods.com/2010/05/bluetooth-wireless-bathroom-scale-with.html">my handmade equivalent</a>, and wifi makes it standalone, which is also a big advantage over my bluetooth scale. I've been using the scale to log my weight and using fatsecret to track caloric intake for a while now, and decided it would be nice to make a dashboard integrating data from both sources. The plots on Withings' website leave much to be desired. Thankfully, both Withings and fatsecret have APIs which make third-party apps possible. Below I'll show some code that uses the rauth library to access both services. rauth provides OAuth 1.0/a, 2.0, and Ofly consumer support. You can install rauth using easy_install:
<br/>
<code>sudo easy_install rauth</code>
<br/>
Before using the Withings and fatsecret APIs, you'll need to register as a developer; you can do that by following the links below:
<br/>
Withings developer registration:
<br/>
<a href="https://oauth.withings.com/en/partner/add">https://oauth.withings.com/en/partner/add </a>
<div class="separator" style="clear: both; text-align: center;">
<a href="http://3.bp.blogspot.com/-AxQ96-ptP0U/T7nAMZQy5qI/AAAAAAAAAIo/AfphUyI8B3U/s1600/withings_registration.png" imageanchor="1" style="margin-left:1em; margin-right:1em"><img border="0" height="400" width="336" src="http://3.bp.blogspot.com/-AxQ96-ptP0U/T7nAMZQy5qI/AAAAAAAAAIo/AfphUyI8B3U/s400/withings_registration.png" /></a></div>
If you just want to extend the code I provide further down, you can fill in arbitrary values for the "Application Website" and "Organization" fields and leave the Callback URL blank. Otherwise, fill in those fields according to your needs.
Once you've registered as a developer with both services you will have your Consumer Key and Shared Secret, which you will need to send API requests.
Now grab my python code from github:
<br><br>
<code>git clone https://github.com/trtg/pywithings.git</code>
<br><br>
The git repo has two files: withings.py (the actual library itself) and example.py (seen below). Open up example.py and fill in your consumer key, consumer secret, and the email address you used to register with Withings, as indicated. Using the library is pretty straightforward: instantiate a withings object and then just call get_weights() to retrieve a list of weight measurements and an associated list of dates (in seconds-since-the-epoch format), which you can then use however you'd like.
<script src="https://gist.github.com/2862723.js"> </script>
Note that the very first time you run example.py you will be prompted to authorize the app to access your withings account by pasting a URL into your browser. Once you go to that URL and login, you will see a screen like this:
<div class="separator" style="clear: both; text-align: center;">
<a href="http://2.bp.blogspot.com/-uiiNbdS8H-M/T8u03sTSF2I/AAAAAAAAAMo/GL-uuOeOmdM/s1600/authorization_prompt.png" imageanchor="1" style="margin-left:1em; margin-right:1em"><img border="0" height="171" width="400" src="http://2.bp.blogspot.com/-uiiNbdS8H-M/T8u03sTSF2I/AAAAAAAAAMo/GL-uuOeOmdM/s400/authorization_prompt.png" /></a></div>
After you click "allow" you will see an oauth token string as shown below. Copy just the part after "oauth_verifier=", in this case PZBDNyyuxDnGkeMxccY.
<div class="separator" style="clear: both; text-align: center;">
<a href="http://4.bp.blogspot.com/-UWnjd3TvtW0/T8u2OVNViVI/AAAAAAAAAM0/uGDSmLIQGHw/s1600/access_granted.png" imageanchor="1" style="margin-left:1em; margin-right:1em"><img border="0" height="110" width="400" src="http://4.bp.blogspot.com/-UWnjd3TvtW0/T8u2OVNViVI/AAAAAAAAAM0/uGDSmLIQGHw/s400/access_granted.png" /></a></div>
Paste the "PZ..." string into your terminal where prompted, hit enter, and you should get a graph like the one below. (Click on the graph to see a larger version.) I used the python package pandas to treat the weight data as a time series and get nice date labels on the x-axis, as well as to plot the smooth red rolling average of the data.
<div class="separator" style="clear: both; text-align: center;">
<a href="http://2.bp.blogspot.com/-h0B7GbycKyk/T9MmI1JJTJI/AAAAAAAAANE/vQVnFYnLt2E/s1600/pandas_withings_plot.png" imageanchor="1" style="margin-left:1em; margin-right:1em"><img border="0" height="207" width="400" src="http://2.bp.blogspot.com/-h0B7GbycKyk/T9MmI1JJTJI/AAAAAAAAANE/vQVnFYnLt2E/s400/pandas_withings_plot.png" /></a></div>
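The rolling average is the only mildly interesting computation here, and it doesn't strictly require pandas. A stdlib-only sketch of the same idea, converting get_weights()'s epoch-seconds dates the same way the example does; the weights and timestamps below are made-up sample data:

```python
from collections import deque
from datetime import datetime, timezone

def rolling_average(values, window):
    """Trailing moving average; the first window-1 points are None,
    mirroring how pandas leaves them NaN."""
    out, buf = [], deque(maxlen=window)
    for v in values:
        buf.append(v)
        out.append(sum(buf) / window if len(buf) == window else None)
    return out

# get_weights() returns parallel lists: weights and epoch-second dates.
weights = [82.1, 81.8, 81.9, 81.5, 81.2]
epochs = [1337600000, 1337686400, 1337772800, 1337859200, 1337945600]
dates = [datetime.fromtimestamp(t, tz=timezone.utc).date() for t in epochs]
smoothed = rolling_average(weights, 3)
```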
In an upcoming post, I'll discuss similar code that retrieves data from fatsecret to generate plots like the one below:
<div class="separator" style="clear: both; text-align: center;">
<a href="http://3.bp.blogspot.com/-XITMQOuNA7U/T7oZ_eJJJqI/AAAAAAAAAJE/p4FBfkXt4oc/s1600/fatsecret_graph.png" imageanchor="1" style="margin-left:1em; margin-right:1em"><img border="0" height="264" width="400" src="http://3.bp.blogspot.com/-XITMQOuNA7U/T7oZ_eJJJqI/AAAAAAAAAJE/p4FBfkXt4oc/s400/fatsecret_graph.png" /></a></div>trtghttp://www.blogger.com/profile/11883202530307275379noreply@blogger.com6tag:blogger.com,1999:blog-1785473796865119343.post-76650539215709142002012-01-11T13:17:00.000-08:002012-01-13T10:51:54.331-08:00Gymnastics tap swing trainerTiming is one of the things I've found most difficult to pick up while learning basic gymnastics. This is especially true when it comes to high bar/strap bar giants and tap swings. If I just blindly practice, I tend to tap far too early. For back giants, the tap is the moment when you transition from a hollow to an arch then back to a hollow again. To facilitate switching from hollow to arch at the right time I built the device you see in the video below. I suspect it would also work well for kip timing (it would tell you when to bring toes to the bar). It's basically a beam break sensor, like you might have at the bottom of your garage door, except it runs on batteries and beeps whenever something crosses between the flashlight and sensor. The idea is that you place the flashlight and sensor on either side of the high-bar at the location where you should ideally tap and then as you swing you just wait to hear the beep before switching from hollow to arch.<br /><br /><iframe width="480" height="385" src="http://www.youtube.com/embed/NZrG_yEOfIo" frameborder="0" allowfullscreen></iframe><br /><br />Here you can see and hear the tap trainer in action- I know my form is terrible, the video is just to show you how the device works. That high pitched beep you hear every time I approach maybe 30 degrees away from vertical is the sensor detecting me crossing its path and then beeping. 
<br /><br /><iframe width="480" height="385" src="http://www.youtube.com/embed/OmLuwBhNhIY" frameborder="0" allowfullscreen></iframe>trtghttp://www.blogger.com/profile/11883202530307275379noreply@blogger.com7tag:blogger.com,1999:blog-1785473796865119343.post-34515672350854025262011-10-17T11:56:00.000-07:002011-10-31T16:57:04.844-07:00Kinect speech recognition in linuxAudio support is now part of libfreenect. Additionally it is now possible to load the Microsoft SDK version of the audio firmware from linux courtesy of a utility called kinect_upload_fw written by Antonio Ospite. This version of the firmware makes the kinect appear to your computer as a standard USB microphone. <br />This means you can now record audio using your kinect, but that's not all that interesting in and of itself. Linux support for speech recognition at this point is not all that great. It is possible to run Dragon NaturallySpeaking via wine or to use the Sphinx project (after much training), but neither of those approaches really appealed to me for simple voice commands (as opposed to dictation). The Google Android project happens to include a speech recognizer from Nuance which by default is meant to be built for an ARM target, like your phone. After extensive hacking around the build system I was able to instead build for an x86 target, like your desktop. Now you can combine these two things (kinect array microphone + android voice recognition) to do some more interesting things, e.g. 
toggle hand tracking on and off via voice.<br /><br /><span style="font-weight:bold;">How to get started:</span><br /><br />1) Check if you have the "unbuffer" application, which is part of the expect scripting tool:<br /><br /><code>which unbuffer</code><br /><br />If the above command comes up empty you should download a copy of unbuffer from the link here:<br /><a href="http://dl.dropbox.com/u/11217419/unbuffer">http://dl.dropbox.com/u/11217419/unbuffer</a><br /><br />Copy unbuffer to a directory that is in your path, like /usr/local/bin or ~/bin<br /><br />2) Download my precompiled version of the srec subproject from here:<br /><a href="http://dl.dropbox.com/u/11217419/srec_kinect.tgz">http://dl.dropbox.com/u/11217419/srec_kinect.tgz </a><br /><br />3) Save the tarball from step 2 in a convenient directory, then unpack it with this command:<br /><code>tar xfz srec_kinect.tgz</code><br /><br />4) Switch into the subdirectory where I've placed some convenience scripts:<br /><code>cd srec/config/en.us</code><br /><br />5) Open a second terminal and in that second terminal also switch into srec/config/en.us<br /><br />6) In the first terminal execute <br /><code>./run_SRecTestAudio.sh</code> <br />and in the other terminal execute<br /><code> cat speech_fifo</code><br /><br />7) Try speaking into your microphone and wait for recognition results to appear in both terminals. 
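Anything that can read a file can consume the recognizer output: instead of cat in step 6, a small Python reader could watch speech_fifo and dispatch commands. A sketch of that idea; the word-to-action table is made up for illustration:

```python
def dispatch(word, actions):
    """Map one recognized word to an action callback; ignore unknown words."""
    handler = actions.get(word.strip().lower())
    if handler:
        return handler()
    return None

# Hypothetical mapping -- substitute whatever your program should do:
actions = {
    "up":   lambda: "scroll-up",
    "down": lambda: "scroll-down",
}

def read_fifo(path, actions):
    # Read the fifo line by line, like `cat speech_fifo` does:
    with open(path) as fifo:
        for line in fifo:
            dispatch(line, actions)
```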
Note that the vocabulary as configured at this point is very small- words like up,down,left,right and the numbers from 1-9 should be recognized properly.<br /><br /><span style="font-weight:bold;">Integrating the kinect:</span><br />1)Acquire Antonio Ospite's firmware tools like so:<br /><code>git clone http://git.ao2.it/kinect-audio-setup.git/ </code><br /><br />2)move into the kinect-audio-setup subdirectory:<br /><code>cd kinect-audio-setup</code><br /><br />3)build kinect_upload_fw as root:<br /><code>make install</code><br /><br />4)Fetch and extract the microsoft kinect SDK audio firmware (depending on your directory permissions, this may also need to be run as root):<br /><code>./kinect_fetch_fw /lib/firmware/kinect</code><br /><br />This will extract the firmware to this location by default:<br /><code>/lib/firmware/kinect/UACFirmware.C9C6E852_35A3_41DC_A57D_BDDEB43DFD04 </code><br /><br />5)Upload the newly extracted firmware to the kinect:<br /><code>kinect_upload_fw /lib/firmware/kinect/UACFirmware.C9C6E852_35A3_41DC_A57D_BDDEB43DFD04</code><br /><br />6)Check for a new USB audio device in your dmesg output<br /><br />7)Configure the kinect USB audio device to be your primary microphone input and <br />try out run_SRecTestAudio.sh again as described earlier. <br /><br /><br /><span style="font-weight:bold;">Additional Notes:</span><br /><br />I unfortunately no longer remember all the changes I had to make in order for the srec project within android build for x86. Perhaps someone with better knowledge of the android build system can chime in at the comments below. In the interim, use the precompiled copy that I have linked above, just be aware that it is old, I think it dates back to the froyo branch of android or earlier (I compiled it a long time ago). 
If you want to take a shot at building the latest srec yourself, check out the Android source code and look under external/srec/<br /><br />The run_SRecTestAudio.sh script sets up the speech recognizer to run on live audio and pipes the recognition results to a fifo in the same directory called speech_fifo. Running cat in the second terminal lets you read out the recognition results as they arrive. Instead of cat you could alternatively have whatever program needs recognition results read from the fifo and act accordingly. Unbuffer is used to make sure you see recognition results right away rather than waiting for the speech_fifo to fill up. <br /><br />The srec recognizer does not require any training but has certain limitations. The most significant limitation is the vocabulary it can recognize. The larger the vocabulary you specify, the less accurate the recognition results will likely be. As a result this recognizer is best used for a small set of frequently used voice commands. Under srec/config/en.us/grammars/ there are a number of .grxml files which define what words the recognizer can understand. You can define your own simple grammar (.grxml) here which, for example, only recognizes the digits on a phone keypad. To do this you can follow the syntax of any of the other .grxml files in the directory and then execute run_compile_grammars.sh, which will produce a .g2g file from the .grxml file. There is also a voicetag/texttag file with extension .tcp which needs to point to the g2g file of your choice. You can find the .tcp files under the srec/config/en.us/tcp directory. 
run_SRecTestAudio.sh points to a tcp file which you can specify.trtghttp://www.blogger.com/profile/11883202530307275379noreply@blogger.com29tag:blogger.com,1999:blog-1785473796865119343.post-17379603873454712202011-02-24T10:56:00.001-08:002011-10-31T16:58:11.898-07:00Kinect audio reverse engineeringI did some work on getting the kinect audio hardware to work as part of openkinect/libfreenect a while back. Here are some quick notes on what I've figured out and how:<br />Once the audio firmware has been loaded, the kinect sends 524 bytes to the xbox every 1 ms; every tenth packet is short (60 bytes) but potentially preceded by an empty packet. The short packets appear to be non-audio data (maybe signaling of some sort), because if you exclude them the resulting data doesn't appear to have any gaps.<br /><br />The audio samples appear to be 32-bit signed at 16 kHz (if you assume that sample rate then the FFT of the recorded data has the correct frequency).<br />The 4 channels seem to be transmitted in order left to right from the perspective of someone looking at the front of the kinect. The leftmost channel is transmitted first. 256 samples of each channel are transmitted before switching to the next channel.<br />If you stitch together disparate 256-sample blocks to reconstruct a given channel, the data appears to be continuous. The plot below shows the captured 4-channel audio stream with the channels labeled from left to right as 1,2,3,4. You can see that the leftmost channel has the greatest amplitude, corresponding to the fact that the speaker was placed closest to the leftmost microphone. 
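To make the channel layout just described concrete, here's how I'd deinterleave a stream organized that way: 4-byte signed samples, sent as 256-sample blocks per channel, leftmost channel first. Little-endian byte order is my assumption, not something I've stated above:

```python
import struct

SAMPLES_PER_BLOCK = 256
NUM_CHANNELS = 4

def deinterleave(raw):
    """Split a byte stream of 32-bit signed samples, sent as 256-sample
    blocks per channel (channel 1 first), back into per-channel lists."""
    samples = struct.unpack("<%di" % (len(raw) // 4), raw)
    channels = [[] for _ in range(NUM_CHANNELS)]
    for i in range(0, len(samples), SAMPLES_PER_BLOCK):
        block = samples[i:i + SAMPLES_PER_BLOCK]
        channels[(i // SAMPLES_PER_BLOCK) % NUM_CHANNELS].extend(block)
    return channels
```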
I repeated the test with a speaker near the rightmost microphone and, as expected, channel 4 became the strongest.<br /><br /><a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="http://4.bp.blogspot.com/-0lzwnVN6sjY/TWas5MtregI/AAAAAAAAAFw/FI53phn1UaQ/s1600/four_data_block.png"><img style="display:block; margin:0px auto 10px; text-align:center;cursor:pointer; cursor:hand;width: 400px; height: 176px;" src="http://4.bp.blogspot.com/-0lzwnVN6sjY/TWas5MtregI/AAAAAAAAAFw/FI53phn1UaQ/s400/four_data_block.png" border="0" alt=""id="BLOGGER_PHOTO_ID_5577335287210277378" /></a><br /><br />I was able to determine the information in the last paragraph by synthesizing a 700Hz sine wave in matlab and then playing it back at the kinect with a speaker nearest the leftmost microphone (as seen from the front of the kinect). I then captured the data stream coming back from the kinect while I played the sine wave using a beagle USB sniffer. I extracted the 524 byte blocks I suspected to be audio from the beagle dumps and then post processed them with a series of shell scripts before reading them into matlab and plotting the FFT of this audio as seen below:<br /><br /><a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="http://1.bp.blogspot.com/-mHzJ3rNb2VA/TWasp7XBYfI/AAAAAAAAAFo/dJEbXcvWwUs/s1600/spectrum_700hz.png"><img style="display:block; margin:0px auto 10px; text-align:center;cursor:pointer; cursor:hand;width: 400px; height: 179px;" src="http://1.bp.blogspot.com/-mHzJ3rNb2VA/TWasp7XBYfI/AAAAAAAAAFo/dJEbXcvWwUs/s400/spectrum_700hz.png" border="0" alt=""id="BLOGGER_PHOTO_ID_5577335024853803506" /></a><br /><br />The frequency shown by the FFT is correctly 700Hz(approx.) 
This suggests that my interpretation of the audio format is correct.<br /> <br /><span style="font-weight:bold;">Firmware loading process</span><br /><br />I've managed to duplicate so far what I think is most of the init sequence-<br />I send all the same control transfers and bulk transfers as the Xbox,<br />as far as I can tell. My beagle480 confirms that I mirror the Xbox behavior for the most part. After completing a series of 512 byte bulk-out transfers which I<br />assume is some sort of bootstrapping firmware upload, the audio device<br />re-enumerates, I wait for that to happen and open the new audio<br />device, then send some more control and bulk transfers. So far ,so<br />good, this all follows what I see in the Xbox logs. At this point the<br />Xbox appears to send 12 cycles of ( 1 xfer: 0 byte iso IN, 8 xfers 4<br />bytes out) which I also duplicate perfectly. Now, the final step is a<br />very long stream of (1 xfer: 0 byte iso IN, 8 xfers 76 bytes out)<br />before eventually those 0 byte IN transfers become 524 bytes<br />transfers. Unfortunately it seems the content of those 76 byte OUT<br />transfers must matter because after trying all zeros I never get any<br />data back in my IN transfers (even after >5000 IN transfers). 
I have some scripts I'll use to try to generate code for all those OUT transfers directly from the .tdc files.trtghttp://www.blogger.com/profile/11883202530307275379noreply@blogger.com8tag:blogger.com,1999:blog-1785473796865119343.post-19390764862206886482010-12-17T02:10:00.000-08:002012-10-20T02:30:41.579-07:00HOWTO: use the kinect as a mouse in linux<object width="480" height="385"><param name="movie" value="http://www.youtube.com/v/_8K9G7jUALI?fs=1&hl=en_US"></param><param name="allowFullScreen" value="true"></param><param name="allowscriptaccess" value="always"></param><embed src="http://www.youtube.com/v/_8K9G7jUALI?fs=1&hl=en_US" type="application/x-shockwave-flash" allowscriptaccess="always" allowfullscreen="true" width="480" height="385"></embed></object><br /><br />In an <a href="http://www.keyboardmods.com/2010/12/howto-kinect-openninite-skeleton.html">earlier post</a> I explained how to get PrimeSense's NITE up and running and how to use the samples they provided. Now some people might be thinking "cool, but how can I use this?" I thought using NITE hand tracking to control the cursor would be a good and simple demonstration. <br /><br />The linux kernel provides a means to create userspace input drivers using a feature called uinput. If you compile your kernel with uinput enabled as a module you can then simply:<br /><code> modprobe uinput</code><br />to load the uinput module. Once the module is loaded you can use the piece of code I've embedded below to convert the coordinates output by the NITE code into actual mouse/cursor movement. 
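To give a feel for what a uinput driver actually writes, here's a sketch of the event plumbing in Python (the actual sample is C++). The struct layout follows linux/input.h on 64-bit Linux; the device-creation ioctls (UI_SET_EVBIT, UI_DEV_CREATE, etc.) are omitted for brevity, so treat this as an illustration of the format rather than a complete driver:

```python
import struct, time

# input_event layout on 64-bit Linux: struct timeval (two longs), then
# type and code (u16 each) and value (s32) -- see linux/input.h.
EV_SYN, EV_REL = 0x00, 0x02
REL_X, REL_Y = 0x00, 0x01
SYN_REPORT = 0

def input_event(ev_type, code, value):
    """Pack one struct input_event as raw bytes."""
    now = time.time()
    sec, usec = int(now), int((now % 1) * 1_000_000)
    return struct.pack("llHHi", sec, usec, ev_type, code, value)

def move_pointer(dev, dx, dy):
    """Write a relative move followed by a SYN_REPORT to a uinput fd."""
    dev.write(input_event(EV_REL, REL_X, dx))
    dev.write(input_event(EV_REL, REL_Y, dy))
    dev.write(input_event(EV_SYN, SYN_REPORT, 0))

# Usage, once the uinput device has been set up with the proper ioctls:
# with open("/dev/uinput", "wb", buffering=0) as dev:
#     move_pointer(dev, 5, -3)
```

The C++ code below does the equivalent: it maps NITE hand coordinates to dx/dy deltas and writes them as input events.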
In short:<br /><br />(1) Download the code below<br />(2) Save it as ~/kinect/NITE/Nite-1.3.0.17/Samples/SingleControl/main.cpp (you might want to back up the original) <br />(3) <code>cd ~/kinect/NITE/Nite-1.3.0.17 && make</code><br />(4) <span style="font-weight:bold;">Note: do the following as root or using sudo</span> <br /><br /><code>~/kinect/NITE/Nite-1.3.0.17/Samples/Bin/Sample-SingleControl</code> <br /><br />(5) Perform a focus gesture to start the hand tracking (check out my video above to see how to do that)<br /><br />At this point you should be able to do what I do in the video above. You can also extend the code to generate mouse clicks, keystrokes, etc. Have fun.<br />
At some point Dropbox ate the public link for the source code I was using before, and in the process of restoring from an old backup the formatting of the code below got a bit mangled. In any case, here's a gist of what I salvaged:
<script src="https://gist.github.com/3922745.js"> </script>
trtghttp://www.blogger.com/profile/11883202530307275379noreply@blogger.com26tag:blogger.com,1999:blog-1785473796865119343.post-88378756383336454682010-12-10T17:54:00.000-08:002011-04-11T13:56:09.162-07:00HOWTO: Kinect + OpenNI/NITE skeleton tracking and gesture recognition in gentooThanks to the folks at PrimeSense libraries are now available for skeleton tracking and gesture recognition. <br /><span style="font-weight:bold;">UPDATE:</span> Check <a href="http://www.keyboardmods.com/2010/12/howto-use-kinect-as-mouse-in-linux.html">here</a> if you've gotten NITE working and want to try using the kinect as a Minority Report style mouse.<br /><span style="font-weight:bold;">UPDATE:</span>I've added a description of how to track multiple hands under the Sample-PointViewer description.<br /><br />Here's how I got things working in gentoo:<br /><br /><br />(1) <code>mkdir ~/kinect && cd ~/kinect</code><br />(2) <code>git clone https://github.com/OpenNI/OpenNI.git </code><br />(3) <code>cd OpenNI/Platform/Linux-x86/Build</code><br />(4) <code>make && sudo make install</code><br />(5) <code>cd ~/kinect/ </code><br />(6) <code>git clone https://github.com/boilerbots/Sensor.git</code><br />(7) <code>cd Sensor</code><br />(8) <code>git checkout kinect</code><br />(9) <code>cd Platform/Linux-x86/Build</code><br />(10) <code>make && sudo make install</code><br />(11) go to this page at openNI to download the latest NITE release for your platform: <a href="http://www.openni.org/downloadfiles/openni-compliant-middleware-binaries/34-stable">NITE download page</a> or for the impatient:<br /><a href="http://www.openni.org/downloadfiles/openni-compliant-middleware-binaries/stable/54-primesense-nite-beta-build-for-for-ubuntu-10-10-x86-32-bit-v1-3-0/download">32-bit</a><br /><a href="http://www.openni.org/downloadfiles/openni-compliant-middleware-binaries/stable/53-primesense-nite-beta-build-for-for-ubuntu-10-10-x64-64-bit-v1-3-0/download">64-bit</a><br /><span 
style="font-weight:bold;">UPDATE:</span> download links now point to openNI and should work again<br />(12) Save the NITE tarball to ~/kinect and untar it<br />(13) <code> cd ~/kinect/NITE/Nite-1.3.0.17/Data </code><br />(14) Open Sample-User.xml and replace the existing License line with the line below:<br /> <span style="font-weight:bold;">NOTE: this is case sensitive!</span> <br /><br />< License vendor="PrimeSense" key="0KOIk2JeIBYClPWVnMoRKn5cdY4="/><br /><br />(15) Repeat step 14 for Sample-Scene.xml and Sample-Tracking.xml<br />(16) Open Sample-User.xml and replace the existing MapOutputMode line with the line below. <br /><span style="font-weight:bold;">NOTE: this is case sensitive!</span><br /> <br /> < MapOutputMode xRes="640" yRes="480" FPS="30"/><br /><br />(17) Repeat step 16 for Sample-Scene.xml and Sample-Tracking.xml<br />(18) <code>niLicense PrimeSense 0KOIk2JeIBYClPWVnMoRKn5cdY4=</code><br />(19) <code>cd ~/kinect/NITE/Nite-1.3.0.17/</code><br />(20) <code>sudo ./install.bash</code><br />(21) <code>make && sudo make install</code><br />(22) <code>cd ~/kinect/NITE/Nite-1.3.0.17/Samples/Bin</code><br /><br />Now finally you should be sitting in a directory with all the sample binaries that you can play with. Here's what they should look like:<br /><br /><span style="font-weight:bold;">Sample-TrackPad:</span><br />This app will track your hand and show its relative position on a grid. 
Run it and wave your hand,one of the squares on the grid should turn yellow to indicate your hand's location as seen below:<br /><br /><a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="http://4.bp.blogspot.com/_lHQV7pC9HHg/TQLeIpT51-I/AAAAAAAAAD0/xgif3Plccdc/s1600/trackpad.png"><img style="display:block; margin:0px auto 10px; text-align:center;cursor:pointer; cursor:hand;width: 400px; height: 267px;" src="http://4.bp.blogspot.com/_lHQV7pC9HHg/TQLeIpT51-I/AAAAAAAAAD0/xgif3Plccdc/s400/trackpad.png" border="0" alt=""id="BLOGGER_PHOTO_ID_5549241930983528418" /></a><br /><br />you should also get some debug output in your console:<br /><br /><a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="http://1.bp.blogspot.com/_lHQV7pC9HHg/TQLfBe-ParI/AAAAAAAAAD8/DQbuNTHauIY/s1600/trackpad_console.png"><img style="display:block; margin:0px auto 10px; text-align:center;cursor:pointer; cursor:hand;width: 400px; height: 171px;" src="http://1.bp.blogspot.com/_lHQV7pC9HHg/TQLfBe-ParI/AAAAAAAAAD8/DQbuNTHauIY/s400/trackpad_console.png" border="0" alt=""id="BLOGGER_PHOTO_ID_5549242907460856498" /></a><br /><br /><span style="font-weight:bold;">Sample-Players</span><br />This app demonstrates the skeletal tracking. After starting it up, move around or wave until your body changes to blue (subsequent players will be other colors, e.g. player 2 is green, 3 yellow,etc.). 
At this point your viewer window should look vaguely like this:<br /><br /><a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="http://3.bp.blogspot.com/_lHQV7pC9HHg/TQhT8De43lI/AAAAAAAAAE8/rFxR8RUIqJg/s1600/SamplePlayersPreCal.png"><img style="display:block; margin:0px auto 10px; text-align:center;cursor:pointer; cursor:hand;width: 400px; height: 267px;" src="http://3.bp.blogspot.com/_lHQV7pC9HHg/TQhT8De43lI/AAAAAAAAAE8/rFxR8RUIqJg/s400/SamplePlayersPreCal.png" border="0" alt=""id="BLOGGER_PHOTO_ID_5550778831926255186" /></a><br /><br />and you should see something like this in your console:<br /><code>Look for pose<br />Found pose "Psi" for user 1<br /></code><br /><br />Now, hold your arms out to your sides bent 90 degrees at the elbows as shown below until a skeleton is overlayed on the image of your body:<br /><a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="http://1.bp.blogspot.com/_lHQV7pC9HHg/TQhSarqcqYI/AAAAAAAAAE0/jCF6pjGPVIM/s1600/SamplePlayersCalPose.png"><img style="display:block; margin:0px auto 10px; text-align:center;cursor:pointer; cursor:hand;width: 400px; height: 267px;" src="http://1.bp.blogspot.com/_lHQV7pC9HHg/TQhSarqcqYI/AAAAAAAAAE0/jCF6pjGPVIM/s400/SamplePlayersCalPose.png" border="0" alt=""id="BLOGGER_PHOTO_ID_5550777159084976514" /></a><br /><br />At this point something like this should have appeared in your console:<br /><br /><code><br />Calibration started<br />Calibration done [1] successfully<br />Writing 217.596 50 50 78.4388 64.6762<br />Matching for existing calibration<br />Read 217.596 50 50 78.4388 64.6762<br /></code><br /><span style="font-weight:bold;">Sample-SingleControl</span><br />This seems to do some sort of gesture recognition and dynamically adjusts the camera resolution, so it's probably zooming in on an area of interest. When it starts out it asks you to perform a focus gesture. 
The NITE documentation doesn't seem to define what this would be but simply sticking one hand out in front of you seems to make it happy and you'll see the following output:<br /><br /><a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="http://1.bp.blogspot.com/_lHQV7pC9HHg/TQLkGoSeZ2I/AAAAAAAAAEM/wySPu0s6X1c/s1600/SampleSingleControl.png"><img style="display:block; margin:0px auto 10px; text-align:center;cursor:pointer; cursor:hand;width: 400px; height: 125px;" src="http://1.bp.blogspot.com/_lHQV7pC9HHg/TQLkGoSeZ2I/AAAAAAAAAEM/wySPu0s6X1c/s400/SampleSingleControl.png" border="0" alt=""id="BLOGGER_PHOTO_ID_5549248493419128674" /></a><br /><br /><span style="font-weight:bold;">SamplePointViewer</span><br />This app does handtracking. <span style="font-weight:bold;">UPDATE:</span> to allow multiple hands to be tracked you will need to edit /usr/etc/primesense/XnVHandGenerator/Nite.ini by uncommenting the two config parameters it contains. Basically remove the semicolons at the start of each line so that Nite.ini looks like this:<br /><code><br />[HandTrackerManager]<br />AllowMultipleHands=1<br />TrackAdditionalHands=1<br /></code><br /><br />To persistently track different hands in your code you can make use of the XnVHandPointContext.nID in your OnPointUpdate callback.<br /><br /><a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="http://3.bp.blogspot.com/_lHQV7pC9HHg/TQLlO4V3HWI/AAAAAAAAAEU/BpuVwy6Lark/s1600/SamplePointViewer.png"><img style="display:block; margin:0px auto 10px; text-align:center;cursor:pointer; cursor:hand;width: 400px; height: 267px;" src="http://3.bp.blogspot.com/_lHQV7pC9HHg/TQLlO4V3HWI/AAAAAAAAAEU/BpuVwy6Lark/s400/SamplePointViewer.png" border="0" alt=""id="BLOGGER_PHOTO_ID_5549249734678879586" /></a><br /><br /><span style="font-weight:bold;">Sample-Boxes</span><br />This example allows you to click one of three boxes, your hand motion is tracked by a slider and depending on the context, up, 
left, right gestures will be recognized.<br /><br /><a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="http://1.bp.blogspot.com/_lHQV7pC9HHg/TQLnD84RCsI/AAAAAAAAAEc/YKDsvNJDbLQ/s1600/SampleBoxes.png"><img style="display:block; margin:0px auto 10px; text-align:center;cursor:pointer; cursor:hand;width: 400px; height: 340px;" src="http://1.bp.blogspot.com/_lHQV7pC9HHg/TQLnD84RCsI/AAAAAAAAAEc/YKDsvNJDbLQ/s400/SampleBoxes.png" border="0" alt=""id="BLOGGER_PHOTO_ID_5549251745941621442" /></a><br /><br /><span style="font-weight:bold;">Sample-CircleControl</span><br />Wave to make the border of the window turn green. Then I think you need to send a focus gesture and then if you trace out a circle in the air with your hand the onscreen circle will follow your hand as seen below. In other words if you draw a clockwise circle in the air, the clock hand will also spin clockwise and vice versa. For some reason, this appears to be annoyingly inconsistent.<br /><br /><a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="http://4.bp.blogspot.com/_lHQV7pC9HHg/TQLorD5bIUI/AAAAAAAAAEk/dNne1y0somQ/s1600/SampleCircleControl.png"><img style="display:block; margin:0px auto 10px; text-align:center;cursor:pointer; cursor:hand;width: 400px; height: 277px;" src="http://4.bp.blogspot.com/_lHQV7pC9HHg/TQLorD5bIUI/AAAAAAAAAEk/dNne1y0somQ/s400/SampleCircleControl.png" border="0" alt=""id="BLOGGER_PHOTO_ID_5549253517352051010" /></a><br /><br /><span style="font-weight:bold;">Sample-SceneAnalysis</span><br />This seems to just do player detection without skeleton tracking:<br /><a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="http://3.bp.blogspot.com/_lHQV7pC9HHg/TQLqd1ajxTI/AAAAAAAAAEs/tlsud8loHjA/s1600/SampleSceneAnalysis.png"><img style="display:block; margin:0px auto 10px; text-align:center;cursor:pointer; cursor:hand;width: 400px; height: 277px;" 
src="http://3.bp.blogspot.com/_lHQV7pC9HHg/TQLqd1ajxTI/AAAAAAAAAEs/tlsud8loHjA/s400/SampleSceneAnalysis.png" border="0" alt=""id="BLOGGER_PHOTO_ID_5549255489149453618" /></a>trtghttp://www.blogger.com/profile/11883202530307275379noreply@blogger.com133tag:blogger.com,1999:blog-1785473796865119343.post-78052338862800091022010-10-18T18:46:00.000-07:002010-10-31T21:06:30.363-07:00The magic keyboard<a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="http://1.bp.blogspot.com/_lHQV7pC9HHg/TMspeDmPIOI/AAAAAAAAACY/7kbEwcm4gz8/s1600/magic_keyboard_connected.jpg"><img style="display:block; margin:0px auto 10px; text-align:center;cursor:pointer; cursor:hand;width: 400px; height: 153px;" src="http://1.bp.blogspot.com/_lHQV7pC9HHg/TMspeDmPIOI/AAAAAAAAACY/7kbEwcm4gz8/s400/magic_keyboard_connected.jpg" border="0" alt=""id="BLOGGER_PHOTO_ID_5533562163493282018" /></a><br /><br />As I hinted at in an <a href="http://www.keyboardmods.com/2010/08/magic-trackpad-with-gentoo-linux.html">earlier post</a>, the magic trackpad hardware is very well designed and bears a strong resemblance to the Fingerworks series of input devices. Now, I've decided to extend the functionality of this device by building a wireless multitouch keyboard using two magic trackpads. Currently, this keyboard will only work in linux since it relies on my extensive modifications to the linux kernel driver for the magic trackpad.<br /> <br />The first image above shows my two magic trackpads with plastic overlays to indicate key placement. It turns out the magic trackpad will still detect contact through a thin insulator placed on top of its surface. The overlays are simply standard laser-printable overhead projector transparencies. I originally considered laser etching the surface of the trackpads at TechShop, but I think I will hold off until I've settled on a key arrangement that I like. Here's a picture showing how I originally planned to lay out the keys relative to my hand. 
<br /><br /><a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="http://4.bp.blogspot.com/_lHQV7pC9HHg/TMswEODurmI/AAAAAAAAACg/SF-yvA_DEoA/s1600/hand_placement.jpg"><img style="display:block; margin:0px auto 10px; text-align:center;cursor:pointer; cursor:hand;width: 400px; height: 348px;" src="http://4.bp.blogspot.com/_lHQV7pC9HHg/TMswEODurmI/AAAAAAAAACg/SF-yvA_DEoA/s400/hand_placement.jpg" border="0" alt=""id="BLOGGER_PHOTO_ID_5533569416206134882" /></a><br /><br />For now, I opted against the layout above in order to simplify the code that converts coordinates to keycodes. The layout I'm currently using consists simply of three concentric circles, so the code simply checks which circles contain the current touch to determine the corresponding "row" and then compares the X coordinate to a table of thresholds to determine the "column". I created the layout seen in the other pictures using Illustrator. I'll post links to PDFs for the left and right halves in case anyone feels like trying this out.<br /><br />The keyboard was intended to be placed on the lap to minimize elbow flexion. To hold the halves together I used two strips of rubberized truck tarp (very strong stuff used for a windsurfing roof rack). I'm still working out how to make this attachment adjustable without being bulky.<br /><a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="http://4.bp.blogspot.com/_lHQV7pC9HHg/TMsn8nfsRhI/AAAAAAAAACQ/fE3fp-szMgU/s1600/magic_keyboard_disconnected.jpg"><img style="display:block; margin:0px auto 10px; text-align:center;cursor:pointer; cursor:hand;width: 400px; height: 199px;" src="http://4.bp.blogspot.com/_lHQV7pC9HHg/TMsn8nfsRhI/AAAAAAAAACQ/fE3fp-szMgU/s400/magic_keyboard_disconnected.jpg" border="0" alt=""id="BLOGGER_PHOTO_ID_5533560489502328338" /></a><br /><br />The transparent blobs seen on the home row of the keyboard in the previous images is clear nail polish. 
A few coats of this allowed creating a raised ridge to facilitate finding the home row while touch typing. This is actually the principal drawback of a multitouch keyboard- the inherent lack of tactile feedback. The nail polish helps somewhat, but I'm still thinking of other ways to mitigate this issue. The image below shows the nail polish blobs close up.<br /><br /><a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="http://1.bp.blogspot.com/_lHQV7pC9HHg/TMsyHrGhSEI/AAAAAAAAACo/k4trrSXCjJc/s1600/magic_keyboard_side_view.jpg"><img style="display:block; margin:0px auto 10px; text-align:center;cursor:pointer; cursor:hand;width: 400px; height: 299px;" src="http://1.bp.blogspot.com/_lHQV7pC9HHg/TMsyHrGhSEI/AAAAAAAAACo/k4trrSXCjJc/s400/magic_keyboard_side_view.jpg" border="0" alt=""id="BLOGGER_PHOTO_ID_5533571674565331010" /></a><br /><br />I currently have preliminary mixed typing/mousing support and a few simple gestures. I also implemented certain behaviors which I think make the use of a touchpad far more ergonomic:<br />(1)Moving the cursor and single clicking is done with two fingers rather than one. <br />This helps eliminate the annoyance of inadvertent cursor movement or clicks when a single finger alights on the trackpad. I also think it's more ergonomic, since for me using a single finger requires more effort than using two.<br />(2)Double clicking is done with a three finger tap- this eliminates the hassle of properly timing a double click.<br />(3)Scrolling is done with four fingers instead of two. This allows just plopping your entire hand down on the trackpad to scroll rather than contorting it to extend only two fingers.<br /><br /><span style="font-weight:bold;">Todo list:</span><br /><br /><span style="font-style:italic;">Software:</span><br />Add support for modifiers (i.e. shift,ctrl,alt,meta) using chords. <br /><br />Write a GUI to rearrange key placement which generates suitable header files based on user's choices. 
Perhaps eventually come up with a means to allow reconfiguration on the fly.<br /><br /><span style="font-style:italic;">Hardware:</span><br />Come up with better attachment and adjustment system for the straps which connect the two halves of the keyboard<br /><br />To give some context for this project- here's some images of the devices the magic trackpad derives from - the iGesture and touchstream. You can see that the magic trackpad is significantly smaller than both the iGesture and touchstream which is why I think that modifiers will have to be handled only via chords (no room for separate keys). In the images below there's a weird blob on the USB cords of the iGesture and touchstream- those are my improvised hot-glue strain reliefs.<br /><br /><a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="http://3.bp.blogspot.com/_lHQV7pC9HHg/TM2zfV12IsI/AAAAAAAAACw/A5xHs2edrRw/s1600/side_by_side.jpg"><img style="display:block; margin:0px auto 10px; text-align:center;cursor:pointer; cursor:hand;width: 400px; height: 152px;" src="http://3.bp.blogspot.com/_lHQV7pC9HHg/TM2zfV12IsI/AAAAAAAAACw/A5xHs2edrRw/s400/side_by_side.jpg" border="0" alt=""id="BLOGGER_PHOTO_ID_5534276868128514754" /></a><br /><br /><a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="http://2.bp.blogspot.com/_lHQV7pC9HHg/TM20VaOl77I/AAAAAAAAAC4/gIq2f025O0k/s1600/on_top.jpg"><img style="display:block; margin:0px auto 10px; text-align:center;cursor:pointer; cursor:hand;width: 400px; height: 252px;" src="http://2.bp.blogspot.com/_lHQV7pC9HHg/TM20VaOl77I/AAAAAAAAAC4/gIq2f025O0k/s400/on_top.jpg" border="0" alt=""id="BLOGGER_PHOTO_ID_5534277797018988466" /></a><br /><br /><a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="http://2.bp.blogspot.com/_lHQV7pC9HHg/TM21EIAi48I/AAAAAAAAADA/5ARwAr9MYNk/s1600/igesture_side_by_side.jpg"><img style="display:block; margin:0px auto 10px; text-align:center;cursor:pointer; cursor:hand;width: 400px; 
height: 205px;" src="http://2.bp.blogspot.com/_lHQV7pC9HHg/TM21EIAi48I/AAAAAAAAADA/5ARwAr9MYNk/s400/igesture_side_by_side.jpg" border="0" alt=""id="BLOGGER_PHOTO_ID_5534278599582081986" /></a>trtghttp://www.blogger.com/profile/11883202530307275379noreply@blogger.com30tag:blogger.com,1999:blog-1785473796865119343.post-38164346786073134682010-08-30T11:34:00.000-07:002012-10-20T03:21:41.273-07:00Magic Trackpad using ten fingers and with gentoo linux support
<object width="480" height="385"><param name="movie" value="http://www.youtube.com/v/6nuWSpzHHOA?fs=1&hl=en_US"></param><param name="allowFullScreen" value="true"></param><param name="allowscriptaccess" value="always"></param><embed src="http://www.youtube.com/v/6nuWSpzHHOA?fs=1&hl=en_US" type="application/x-shockwave-flash" allowscriptaccess="always" allowfullscreen="true" width="480" height="385"></embed></object><br /><br />
I picked up a magic trackpad this past Friday and decided to see what it was capable of under linux. The video above shows that the magic trackpad hardware provides a great deal of information to the host operating system. As I'd hoped, it seems to be nearly as capable as a FingerWorks iGesture or TouchStream in some regards. In the video you can see that the trackpad is able to detect 10 fingers, track each finger contact's elliptical size (i.e. along two axes) and orientation, and do all of the above smoothly at a high sample rate. <br /><br />
The video was created by streaming the debugfs file entry corresponding to the trackpad into a pygame application. The pygame code parses the apple protocol packets to determine ellipse sizes, positions and orientations and then blits them to the screen on transparent surfaces. This code is by no means pretty; in fact, "quick hack" is probably a better description. Any suggestions, enhancements, criticisms, etc. are welcome.<br /><br />If you want to try the code below yourself, you first need to make sure you have debugfs set up properly. Assuming you enabled debugfs when building your kernel, make sure debugfs is mounted; if it's not, mount it with something like:
<br /><code> mount -t debugfs none /sys/kernel/debug/</code>
<br />Now, if you look in /sys/kernel/debug/hid/ you should see a directory whose name corresponds to the address of your magic trackpad. If you switch into this directory you'll see an events file which you can then read with cat, tail, or the python script below.
<script src="https://gist.github.com/3922875.js"> </script>
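If you'd rather roll your own reader, the parsing boils down to pulling the raw report bytes out of each line of the events file. The 'report (size N) ... = aa bb cc' line shape assumed below is just that, an assumption — check what `cat events` prints on your kernel before relying on it:

```python
import re

def parse_report_line(line):
    """Extract raw report bytes from one line of the debugfs events file.

    Assumes lines of the form 'report (size N) ... = aa bb cc ...';
    lines without an '=' (status/debug chatter) are skipped.
    """
    if "=" not in line:
        return None
    hex_part = line.split("=", 1)[1]
    return bytes(int(b, 16) for b in re.findall(r"\b[0-9a-fA-F]{2}\b", hex_part))

# Stream reports as they arrive (the directory name is an example address):
# with open("/sys/kernel/debug/hid/0005:05AC:030E.0005/events") as f:
#     for line in f:
#         report = parse_report_line(line)
#         if report:
#             print(len(report), report.hex())
```

The pygame code in the gist does essentially this before decoding each report into ellipse parameters.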
Note that support for the magic trackpad is still in a very preliminary stage, so getting things working is fairly involved. I'll describe how to play around with the hardware at a low level in the description below. <br /><br />First, start with a kernel new enough to have magic-mouse support. In my case, I chose gentoo-2.6.35-r5. Now grab the multitouch branch of Chase Douglas's debian git repository. You'll need to copy or merge several files from Chase's kernel source to your own: <br />/drivers/hid/hid-magicmouse.c <br />/drivers/hid/hid-core.c <br />/drivers/hid/hid-ids.h<br />possibly a few others<br /><br />If you feel like using the same kernel as Chase, just skip straight to compiling his source tree. Once the kernel is built and you've booted into it, it's time to play around with the trackpad. <br /><br />Depending on the version of bluez you're using, the procedure for pairing to an HID device will vary. In bluez 3.32 you would first set the trackpad as a trusted device using DBUS, which you could either do programmatically or using a graphical tool like d-feet to call the method<br /><code> /org/bluez/hci0/org.bluez.Adapter.SetTrusted("bluetooth address of your trackpad").</code><br /><br /> Once the device is set as trusted you would need to actually setup a pairing as root:<br /><code> passkey-agent --default 0000</code><br />then use d-feet or DBUS CLI to call CreateDevice("BluetoothAddress"). 
At this point if you call ListDevices() you should a new device corresponding to the trackpad.<br /> Once you've paired with the trackpad and loaded the hid-magicmouse kernel module you should see messages in the system log indicating that a new input device has been registered.<br /><code><br />input: Apple Wireless Trackpad as /class/input/input6<br />magicmouse 0005:05AC:030E.0005: input,hidraw4: BLUETOOTH HID v1.60 Mouse [Apple Wireless Trackpad] on 00:27:48:09:63:60<br />input: Apple Wireless Trackpad as /class/input/input7<br /></code><br /><br />Note that you'll also want an up to date installation of evdev, otherwise you might get messages like this:<br /><code><br />evdev.c(EVIOCGBIT): Suspicious buffer size 511, limiting output to 64 bytes. See http://userweb.kernel.org/~dtor/eviocgbit-bug.html<br /></code><br /><br />If you want to see the touch reports the trackpad produces you can use the evtest application. In gentoo this is provided by the joystick ebuild. Now if you do<br /><code><br />]$ evtest /dev/input/event6<br />Input driver version is 1.0.0<br />Input device ID: bus 0x5 vendor 0x5ac product 0x30e version 0x160<br />Input device name: "Apple Wireless Trackpad"<br />Supported events:<br /> Event type 0 (Sync)<br /> Event type 1 (Key)<br /> Event code 272 (LeftBtn)<br /> Event code 325 (ToolFinger)<br /> Event code 330 (Touch)<br /> Event code 333 (Tool Doubletap)<br /> Event code 334 (Tool Tripletap)<br /> Event code 335 (?)<br /> Event type 3 (Absolute)<br /> Event code 0 (X)<br /> Value 3097<br /> Min -2909<br /> Max 3167<br /> Event code 1 (Y)<br /> Value 2238<br /> Min -2456<br /> Max 2565<br /> Event code 48 (?)<br /> Value 0<br /> Min 0<br /> Max 255<br /> Event code 49 (?)<br /> Value 0<br /> Min 0<br /> Max 255<br /> Event code 52 (?)<br /> Value 0<br /> Min -32<br /> Max 31<br /> Event code 53 (?)<br /> Value 0<br /> Min -2909<br /> Max 3167<br /> Event code 54 (?)<br /> Value 0<br /> Min -2456<br /> Max 2565<br /> Event code 57 (?)<br 
/> Value 0<br /> Min 0<br /> Max 15<br /> Event type 4 (Misc)<br /> Event code 3 (RawData)<br />Testing ... (interrupt to exit)<br /><br /></code><br />Now if you touch with two fingers you'll see a slew of output which should include something like:<br /><code><br />Event: time 1283194689.870573, -------------- Config Sync ------------<br />Event: time 1283194689.870586, type 1 (Key), code 330 (Touch), value 0<br />Event: time 1283194689.870587, type 1 (Key), code 333 (Tool Doubletap), value 0<br />Event: time 1283194689.870589, -------------- Report Sync ------------<br /></code>trtghttp://www.blogger.com/profile/11883202530307275379noreply@blogger.com5tag:blogger.com,1999:blog-1785473796865119343.post-37831001861828668192010-05-23T15:15:00.000-07:002010-05-23T15:30:49.361-07:00Mac Mighty Mouse trackball embedded interface hack<a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="http://4.bp.blogspot.com/_lHQV7pC9HHg/S_mrkJ1oCGI/AAAAAAAAABc/WPkgdyd-nuQ/s1600/mighty_mouse_pinout_big.jpg"><img style="display:block; margin:0px auto 10px; text-align:center;cursor:pointer; cursor:hand;width: 400px; height: 129px;" src="http://4.bp.blogspot.com/_lHQV7pC9HHg/S_mrkJ1oCGI/AAAAAAAAABc/WPkgdyd-nuQ/s400/mighty_mouse_pinout_big.jpg" border="0" alt=""id="BLOGGER_PHOTO_ID_5474595459650029666" /></a><br />Inspired by input nirvana from geekhack I got a free mighty mouse from a generous donor and went about interfacing to it. There's plenty of teardowns already available on the net, so I'll skip opening the mouse and just show this picture with the pinout to the trackball. Note that it's even simpler than most typical trackballs in that it doesn't make use of quadrature encoding for the two axes. The trackball actually has four distinct pulsed outputs corresponding to the four little wheels surrounding the ball. When rolling in any of the four cardinal directions (left,right, up,down) only one of those little wheels is actually spinning. 
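That one-wheel-per-direction arrangement makes decoding straightforward, as described next. A sketch of the polling approach in Python, where `samples` stands in for a hypothetical GPIO read of the four wheel outputs on the microcontroller:

```python
DIRECTIONS = ("left", "right", "up", "down")

def decode_step(prev_pins, pins):
    """Compare this poll's four pin states to the previous poll's.

    Any pin that toggled means the wheel for that direction ticked once.
    Returns the directions that moved (normally zero or one, since only
    one wheel spins per cardinal direction).
    """
    return [d for d, a, b in zip(DIRECTIONS, prev_pins, pins) if a != b]

# Example poll sequence: the "right" wheel's output toggles twice.
samples = [(0, 0, 0, 0), (0, 1, 0, 0), (0, 0, 0, 0)]
moves = [decode_step(a, b) for a, b in zip(samples, samples[1:])]
print(moves)  # [['right'], ['right']]
```

Each detected tick would then be accumulated into a movement delta and reported over USB.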
Decoding the output of the overall trackball simply requires detecting pulses on each of the four pins. If the state of any pin changes from what it was during the last polling interval then you've moved in that direction. Instead of polling you could also just set up four pin change interrupts.<br /><br />To test things out I hooked up the trackball to one of the new SMT connector boards from Schmartboard (seen in the picture) and from there to an atmega32u4.<br /><br />At the moment I haven't decided whether to make a small wireless mouse out of this or integrate it into my kinesis project. I'm leaning heavily towards the former, since the keyboard already has 4 pointing devices.<br /><br /> Total time taken from disassembly to working USB trackball mouse (including the time for this quick writeup): ~2hours 10 mins.trtghttp://www.blogger.com/profile/11883202530307275379noreply@blogger.com5tag:blogger.com,1999:blog-1785473796865119343.post-90359180263075422992010-05-02T12:58:00.000-07:002010-05-02T13:55:29.565-07:00forearm massager for climbers (and keyboard users)<a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="http://1.bp.blogspot.com/_lHQV7pC9HHg/S93kyi_nh9I/AAAAAAAAABE/-oAt5VtIh20/s1600/2010-05-02+13.29.31.jpg"><img style="display:block; margin:0px auto 10px; text-align:center;cursor:pointer; cursor:hand;width: 400px; height: 299px;" src="http://1.bp.blogspot.com/_lHQV7pC9HHg/S93kyi_nh9I/AAAAAAAAABE/-oAt5VtIh20/s400/2010-05-02+13.29.31.jpg" border="0" alt=""id="BLOGGER_PHOTO_ID_5466777079735879634" /></a><br /><br /><br /><object width="480" height="385"><param name="movie" value="http://www.youtube.com/v/lDEIhyj4Y1g&hl=en_US&fs=1&"></param><param name="allowFullScreen" value="true"></param><param name="allowscriptaccess" value="always"></param><embed src="http://www.youtube.com/v/lDEIhyj4Y1g&hl=en_US&fs=1&" type="application/x-shockwave-flash" allowscriptaccess="always" allowfullscreen="true" width="480" 
height="385"></embed></object><br /> <br />OK. the link here to keyboards is probably even more tenuous than my bluetooth scale but whatever. In the quest to relieve perpetually tense forearms due to bouldering + heavy keyboard usage I came up with this massager using easily obtained components. The total parts list with estimated price: (all items except for skateboard wheels were obtained at Lowe's)<br /><br /><a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="http://4.bp.blogspot.com/_lHQV7pC9HHg/S93jvl9WSYI/AAAAAAAAAA8/_OVqkozvVvE/s1600/2010-05-02+12.32.49.jpg"><img style="display:block; margin:0px auto 10px; text-align:center;cursor:pointer; cursor:hand;width: 400px; height: 299px;" src="http://4.bp.blogspot.com/_lHQV7pC9HHg/S93jvl9WSYI/AAAAAAAAAA8/_OVqkozvVvE/s400/2010-05-02+12.32.49.jpg" border="0" alt=""id="BLOGGER_PHOTO_ID_5466775929480432002" /></a><br /><br /><br />4 skateboard wheels with bearings: $10 at target, but you can easily wind up spending far more for fancier wheels like the 70mm kryptonics wheels in most of these pictures. <br /><br />2 4" wood clamps: 2@ $4.99 ea. = $9.98<br />2 5/16" x 8" machine bolts: 2@ $1.15 = $2.30<br />3 11"x1/4" threaded rods: 3@ $0.97 = $2.91<br />8 plastic spacers: 4 x (2 @ $1.09)= $4.36<br />2 pieces of thicker(1.5"x1.5"x8") wood: ? this was just scrap wood I had lying around, not sure how much it would cost.<br />(optional) 1 piece thinner wood (0.75"x1.5"x8") also scrap wood and not really necessary, just added to allow clamping in one other place.<br /><br />4 pieces of 1/4" thick ABS plastic: ? scrap plastic I had lying around, not sure how much it would cost but small sheets of lexan or acrylic at Lowe's are pretty cheap.<br /><br /><br />Assembly:<br /><br />Putting things together is pretty easy: thread one wheel onto one of the 8" bolts, followed by 2 plastic spacers. 
<br /><br /><a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="http://3.bp.blogspot.com/_lHQV7pC9HHg/S93mGjn1aeI/AAAAAAAAABM/BL0qF5Xtoac/s1600/2010-05-02+12.33.35.jpg"><img style="display:block; margin:0px auto 10px; text-align:center;cursor:pointer; cursor:hand;width: 299px; height: 400px;" src="http://3.bp.blogspot.com/_lHQV7pC9HHg/S93mGjn1aeI/AAAAAAAAABM/BL0qF5Xtoac/s400/2010-05-02+12.33.35.jpg" border="0" alt=""id="BLOGGER_PHOTO_ID_5466778523013573090" /></a><br /><br />Next, add another wheel followed by two more spacers and a piece of ABS plastic. <br /><a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="http://4.bp.blogspot.com/_lHQV7pC9HHg/S93mpfxbL1I/AAAAAAAAABU/iEFPp5jhJ9M/s1600/2010-05-02+12.34.22.jpg"><img style="display:block; margin:0px auto 10px; text-align:center;cursor:pointer; cursor:hand;width: 299px; height: 400px;" src="http://4.bp.blogspot.com/_lHQV7pC9HHg/S93mpfxbL1I/AAAAAAAAABU/iEFPp5jhJ9M/s400/2010-05-02+12.34.22.jpg" border="0" alt=""id="BLOGGER_PHOTO_ID_5466779123275476818" /></a>trtghttp://www.blogger.com/profile/11883202530307275379noreply@blogger.com9tag:blogger.com,1999:blog-1785473796865119343.post-15072240813935889072010-05-02T09:56:00.000-07:002012-05-21T01:04:47.925-07:00bluetooth wireless bathroom scale with fatsecret integration<a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="http://2.bp.blogspot.com/_lHQV7pC9HHg/S925osR4F3I/AAAAAAAAAAU/vJc8Cfk-AIs/s1600/2010-05-02+10.37.09.jpg"><img style="display:block; margin:0px auto 10px; text-align:center;cursor:pointer; cursor:hand;width: 299px; height: 400px;" src="http://2.bp.blogspot.com/_lHQV7pC9HHg/S925osR4F3I/AAAAAAAAAAU/vJc8Cfk-AIs/s400/2010-05-02+10.37.09.jpg" border="0" alt=""id="BLOGGER_PHOTO_ID_5466729631429695346" /></a><br /><br />Not exactly a keyboard, but I'd argue that a datalogging scale is an input device... In any case, this is a project I did about 2 years ago. 
Start with a cheap bathroom scale from ikea, then strip out all the hardware except for the load cells. Add in the following:<br /><br />microcontroller (AT90USB162) <br />roving networks bluetooth serial module<br /><a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="http://2.bp.blogspot.com/_lHQV7pC9HHg/S926SxrThqI/AAAAAAAAAAk/PlYsgh9DCT4/s1600/2010-05-02+10.36.22.jpg"><img style="display:block; margin:0px auto 10px; text-align:center;cursor:pointer; cursor:hand;width: 400px; height: 299px;" src="http://2.bp.blogspot.com/_lHQV7pC9HHg/S926SxrThqI/AAAAAAAAAAk/PlYsgh9DCT4/s400/2010-05-02+10.36.22.jpg" border="0" alt=""id="BLOGGER_PHOTO_ID_5466730354433033890" /></a><br /><br />24-bit ADC (AD7799) <br /><a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="http://1.bp.blogspot.com/_lHQV7pC9HHg/S9254n6TAKI/AAAAAAAAAAc/d5dSf7vkBz0/s1600/2010-05-02+10.36.37.jpg"><img style="display:block; margin:0px auto 10px; text-align:center;cursor:pointer; cursor:hand;width: 400px; height: 299px;" src="http://1.bp.blogspot.com/_lHQV7pC9HHg/S9254n6TAKI/AAAAAAAAAAc/d5dSf7vkBz0/s400/2010-05-02+10.36.37.jpg" border="0" alt=""id="BLOGGER_PHOTO_ID_5466729905134960802" /></a><br /><br />4 AA batteries + linear regulator (don't want switching noise here) + input ORing diodes to allow powering from batteries or USB without damaging either (handy during development)<br /><br />After assembling the above hardware, the microcontroller code was fairly straightforward: when a user taps a corner of the scale I take repeated ADC readings until the "running STD" (standard deviation of the last N measurements) drops below a certain threshold. At this point the scale has a stable zero point to normalize the subsequent weighing. When the user stands on the scale I do the same thing: take ADC readings until STD < threshold. I chose the threshold such that the delay before convergence was short, at the expense of some accuracy. 
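The stabilization loop is easy to sketch. The real firmware runs in C on the AT90USB162; what follows is only a host-side Python illustration of the same idea, where <code>read_adc</code>, the window size, and the threshold are illustrative stand-ins rather than the values the scale actually uses:

```python
from collections import deque
from statistics import stdev

def read_until_stable(read_adc, window=8, threshold=5.0, max_reads=500):
    """Poll the ADC until the standard deviation of the last
    `window` readings drops below `threshold`, then return their
    mean as the stabilized value.  Gives up (returns None) if the
    signal never settles within `max_reads` samples."""
    recent = deque(maxlen=window)  # keeps only the last `window` readings
    for _ in range(max_reads):
        recent.append(read_adc())
        if len(recent) == window and stdev(recent) < threshold:
            return sum(recent) / window
    return None
```

Both the zeroing step and the weighing step then become the same call: first <code>tare = read_until_stable(read_adc)</code> with the scale empty, then <code>weight_counts = read_until_stable(read_adc) - tare</code> once the user steps on.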
Left to run long enough, the scale was able to detect the difference between my bodyweight and my bodyweight plus one US quarter. With the current threshold the scale appears to be repeatably accurate to within 0.1 lbs, which for my purposes is acceptable. I was actually pretty happy to get that level of performance considering I didn't lay out a board for the ADC and just used a schmartboard instead.<br /><br />One other metric I was happy about is the battery life: more than 9 months when used once or twice a day. I coded things such that the microcontroller is in a deep sleep until the user touches a corner of the scale. I could have taken a hardware approach and actually disconnected the uC power with a transistor until the user touched the scale, thus arming a delay circuit, but the software approach yielded more than acceptable results. The uC current draw is in the microamp range when sleeping, and I use transistors to disconnect power from the bluetooth module and ADC while sleeping, so I probably shouldn't have been surprised.<br />I think for a future revision of this project I'll actually spin my own main board. This would allow for a much more compact footprint and let me power from a smaller 3.3V source instead of the stack of 4 AAs. This latter arrangement was needed to avoid extensive rework of the olimex board, where certain bits of hardware expect 5V.<br /><br />On the PC side of this project I'm using the bluez rfcomm server to listen for inbound serial connections and then run some other code when a connection is established. That other code consists of python scripts that insert the current reading into a sqlite database (which is then visible from my ruby-on-rails site) and submit the reading to fatsecret via its API. 
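For a flavor of that PC-side recording step, here is a minimal Python sketch; the <code>readings</code> table and its schema are made up for illustration (the real scripts also hand the same reading off to fatsecret):

```python
import sqlite3
import time

def record_weight(db_path, pounds):
    """Append one timestamped weight reading to a local sqlite
    database (hypothetical 'readings' table)."""
    conn = sqlite3.connect(db_path)
    try:
        # Create the table on first use so the script is self-bootstrapping.
        conn.execute("CREATE TABLE IF NOT EXISTS readings "
                     "(taken_at REAL, pounds REAL)")
        conn.execute("INSERT INTO readings (taken_at, pounds) VALUES (?, ?)",
                     (time.time(), pounds))
        conn.commit()
    finally:
        conn.close()
```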
This data submission is done using OAuth, so the whole process requires no human intervention and I avoid any ugly hacks involving plaintext saved passwords.<br /><br />Fatsecret is great because: <br />(1) it has an API <br />(2) it has an android version <br />(3) it can scan barcodes to fetch associated calories and nutritional content<br />Here's a screenshot from my phone showing a graph of weight readings for the last few months.<br /><a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="http://1.bp.blogspot.com/_lHQV7pC9HHg/S922lLvVnPI/AAAAAAAAAAM/3yDGKn-xSis/s1600/calorie_counter.png"><img style="display:block; margin:0px auto 10px; text-align:center;cursor:pointer; cursor:hand;width: 225px; height: 400px;" src="http://1.bp.blogspot.com/_lHQV7pC9HHg/S922lLvVnPI/AAAAAAAAAAM/3yDGKn-xSis/s400/calorie_counter.png" border="0" alt=""id="BLOGGER_PHOTO_ID_5466726272620403954" /></a><br /><br />One final aspect of the scale which I find interesting is the means by which the user interacts with it. I intentionally made it so that the scale does not have an LCD to show the current weight reading. Day-to-day fluctuations are not reflective of what really matters: long-term trends. Usage of the scale thus boils down to the following:<br />(1) User taps the lower corner of the scale to initiate the auto-zero sequence<br />(2) The scale turns on its sole LED (solid red) to indicate that the zero process is ongoing<br />(3) The scale switches its status LED from solid red to blinking to indicate the zero process is complete<br />(4) User stands on the scale until the LED turns off, indicating the reading has stabilized<br />(5) User reviews cumulative weighing data at some later time (either via phone, or at their PC)<br /><br />For fun, I added an additional means of relaying data to the user: spoken weights. 
By feeding the stabilized weight reading to festival (text to speech engine), the scale can "announce" the most recent weighing.trtghttp://www.blogger.com/profile/11883202530307275379noreply@blogger.com13tag:blogger.com,1999:blog-1785473796865119343.post-19175217516929603932010-04-21T15:05:00.000-07:002010-05-02T11:44:44.522-07:00wireless split kinesis contour<a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="http://3.bp.blogspot.com/_lHQV7pC9HHg/S93HG62gXCI/AAAAAAAAAA0/sQSXWpHfGgw/s1600/2010-04-21+11.42.15.jpg"><img style="display:block; margin:0px auto 10px; text-align:center;cursor:pointer; cursor:hand;width: 400px; height: 299px;" src="http://3.bp.blogspot.com/_lHQV7pC9HHg/S93HG62gXCI/AAAAAAAAAA0/sQSXWpHfGgw/s400/2010-04-21+11.42.15.jpg" border="0" alt=""id="BLOGGER_PHOTO_ID_5466744444388662306" /></a><br /><a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="http://4.bp.blogspot.com/_lHQV7pC9HHg/S93G0yp9sAI/AAAAAAAAAAs/dHN-0hvmh74/s1600/2010-04-21+11.41.58.jpg"><img style="display:block; margin:0px auto 10px; text-align:center;cursor:pointer; cursor:hand;width: 400px; height: 299px;" src="http://4.bp.blogspot.com/_lHQV7pC9HHg/S93G0yp9sAI/AAAAAAAAAAs/dHN-0hvmh74/s400/2010-04-21+11.41.58.jpg" border="0" alt=""id="BLOGGER_PHOTO_ID_5466744132950929410" /></a><br /><br /><br />I finally have almost all the hard issues sorted out and I've assembled the left side of my heavily customized kinesis keyboard. Here's what's currently working:<br /><br />Seamless wired to wireless operation: <br />Recharges when it's connected via USB, but can still be used while doing so. When disconnected, the wireless interface starts working immediately. When reconnected the wireless is disabled.<br /><br />Support for 4 pointing devices:<br />The final keyboard will have 1 trackpoint + 1 touchpad per side. I've verified that I can support 3, and I'm 99% certain that 4 should work. 
I'm using synaptics touchpads and trackpoints directly connected to my microcontroller. This means I can implement distinct functionality for the left and right sides- e.g. one side scrolls, the other side moves the cursor.<br /><br />Runtime-selectable keymaps:<br />I'm using a microcontroller with plenty of flash, so I have room to spare for extra keymaps (e.g. QWERTY, Dvorak, etc.). I've verified the ability to switch between them on the fly.<br /><br />Remaining Tasks<br /><br />At this point I need to:<br />solder the right half of the key matrix to the controller<br />build the plastic housings for each half<br />integrate the touchpads and trackpoints into the plastic housings<br />try to ruggedize everything as much as possible<br /><br />The mechanical stuff has always proved to be the hardest for me. In the picture above the keyboard frame is still in one piece because I'm still measuring out how to integrate 2 touchpads and 2 trackpoints- there's barely enough room. A very similar issue is why I ended up postponing further work on the wireless alphagrip; it was just too hard to fit everything back inside. When I get back to working on the alphagrip, my plan is to make my own more compact keymatrix, perhaps consisting of individually soldered keyswitches to free up room vs. a full PCB.trtghttp://www.blogger.com/profile/11883202530307275379noreply@blogger.com25tag:blogger.com,1999:blog-1785473796865119343.post-57830524291596958362009-08-20T02:36:00.000-07:002009-08-20T04:40:02.552-07:00nintendo wiimote with motionplusI've been messing around a bit with combining gyro data from the motion plus module and accelerometer data from the base wiimote. I've been using a Kalman filter to merge gyro + accelerometer for pitch and roll, and this seems to work fairly well. The contribution of the gyro acts to smooth the accelerometer data, at least for the two rotations which are not completely perpendicular to the force of gravity (pitch and roll). 
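The Kalman filter itself is more machinery than fits in a blog post, but a complementary filter illustrates the same trade-off (gyro for short-term smoothness, accelerometer to correct long-term drift) in a single line. This Python sketch and its <code>alpha</code> value are purely illustrative, not the filter I actually run:

```python
def fuse_angle(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """One complementary-filter update: integrate the gyro rate
    for a smooth short-term estimate, then pull the result
    slightly toward the accelerometer-derived angle so gyro
    drift cannot accumulate."""
    return alpha * (angle + gyro_rate * dt) + (1.0 - alpha) * accel_angle
```

An <code>alpha</code> close to 1 trusts the integrated gyro over short intervals, while the small accelerometer weight slowly drags the estimate back to the gravity reference. Roughly speaking, a Kalman filter chooses this blend adaptively from the noise statistics instead of hard-coding it.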
The accelerometer data appears to be completely useless for yaw. The gyro data does a decent job of measuring yaw but it drifts, and the worthless accelerometer data does not help in this case. Perhaps applying some sort of high-pass filter to the gyro yaw might allow salvaging some useful information. trtghttp://www.blogger.com/profile/11883202530307275379noreply@blogger.com5tag:blogger.com,1999:blog-1785473796865119343.post-42871082178964264382009-08-20T02:00:00.000-07:002009-08-20T02:32:19.801-07:00wireless alphagrip progress<p>I've finally managed to resolve some timing problems with the RF link; I now have my heavily customized wireless alphagrip mostly working. Up till now I was having issues with jerky trackball movement when using the alphagrip wirelessly; I've managed to sort that out, so cursor movement is now smooth and keystrokes are no longer duplicated. Here are the features I've added so far: </p><ul><li>Wireless and wired operation- I retrofitted the alphagrip with a lithium-ion battery, a USB charger IC, and an ultra-low quiescent current regulator in such a way that when you plug it in the battery recharges but the alphagrip is still fully functional. When you disconnect, the alphagrip begins to work wirelessly by talking to the special USB dongle I'm using (NOT bluetooth). </li><li>Scroll wheel emulation using the trackball- enabling numlock will lock out the trackball x-axis and all buttons except the mouse buttons. In other words, when you enable numlock the trackball no longer moves the cursor; it just acts as a scroll wheel. </li><li>Firmware upgrade-ability on the fly- a special key sequence kicks the alphagrip into its bootloader, at which point new firmware can be loaded without having to physically hit a reset button tied to the microcontroller. </li><li>Real modifier keys- when you push Ctrl, Shift, etc. they are actually transmitted to the host. 
This allows things like Ctrl/Shift-click multiple selection, which aren't possible with the original alphagrip. </li></ul><p>At this point the remaining tasks before I can deem this keyboard fully functional are mostly mechanical issues. Solder joints at the Ctrl key are bad and need to be retouched, but this is easy. The hard problem that I've yet to resolve is cramming all the new hardware components into the very limited amount of space inside the alphagrip. There is just barely enough room for the new microcontroller board, RF transceiver, battery+regulator, and tons of wires. I may just try to dremel out any plastic that isn't absolutely critical in order to free up some space. Once things are more finalized I'll post some pics.</p><p><br /></p><p><br /></p>trtghttp://www.blogger.com/profile/11883202530307275379noreply@blogger.com8tag:blogger.com,1999:blog-1785473796865119343.post-53159358954448259542009-08-02T05:01:00.001-07:002010-10-31T21:20:02.200-07:00Wacom bamboo pen and touchThis device is best summed up as a wacom tablet with some minimal touch functionality thrown in as an afterthought. The stylus+eraser functionality behaves as one would expect, and you can seamlessly switch from stylus to finger on the tablet surface. So far, so good. 
Unfortunately, the touch functionality is not smooth at all when compared to the <a href="http://www.keyboardmods.com/2010/08/magic-trackpad-with-gentoo-linux.html">magic trackpad</a>, <a href="http://www.keyboardmods.com/2009/08/fingerworks-igesture.html">iGesture</a>, or even the <a href="http://www.keyboardmods.com/2009/08/cirque-smartcatpro-touchpad.html">cirque smartcat</a>. The multitouch functionality in particular is very limited, with only a small set of available gestures and only two simultaneous touch points supported.trtghttp://www.blogger.com/profile/11883202530307275379noreply@blogger.com2tag:blogger.com,1999:blog-1785473796865119343.post-22821335685313652992009-08-02T05:00:00.001-07:002009-08-02T05:00:29.632-07:00Anthrotronix acceleglovesoon...trtghttp://www.blogger.com/profile/11883202530307275379noreply@blogger.com3tag:blogger.com,1999:blog-1785473796865119343.post-83087681252603011782009-08-02T04:40:00.000-07:002009-08-02T04:46:29.516-07:00where to buy ergonomic gear in the bay areaIf you live in the bay area, http://www.askergoworks.com/ is the only place I know of where you can actually try out various ergonomic keyboards, mice, chairs, etc. before you buy them. They have a showroom in Palo Alto where you can check out their products by appointment. 
One cool thing I discovered- they are a reseller for a custom office chair company- you can specify every piece of the chair to your liking: back, seat, fabric, hydraulics, etc.<br /><br />I have no connection to askergoworks.com other than purchases I made there: both my kinesis keyboards+footpedal, and both my cirque touchpads.trtghttp://www.blogger.com/profile/11883202530307275379noreply@blogger.com5tag:blogger.com,1999:blog-1785473796865119343.post-53192881853467037272009-08-02T04:35:00.001-07:002009-08-02T04:37:09.616-07:00evoluent vertical mousesoon...trtghttp://www.blogger.com/profile/11883202530307275379noreply@blogger.com2tag:blogger.com,1999:blog-1785473796865119343.post-75176706243421883132009-08-02T04:34:00.001-07:002009-12-21T15:37:09.226-08:00gyration airmouseThis mouse is the more compact, but less ergonomic (IMHO), version of the go pro. The kit consists of a mouse, an RF transceiver dongle which hides in the mouse itself when not in use, and a zippered pouch for the mouse. Standard alkaline batteries are needed (the mouse comes with two). The key points for me: unless you have very small hands, the mouse is awkward to hold up in the air. The movement tracking is really not that great, especially if you inadvertently roll the mouse at all. It's too easy to brush your fingers under the optical sensor of the mouse while holding it up in the air. There does not appear to be a way to disable the traditional/optical mouse functionality, which makes the aforementioned problem even worse. My main interest at this point is tearing it down to see if some sort of wearable, wireless pointing device can be hacked out of it, since it is so small and lightweight.trtghttp://www.blogger.com/profile/11883202530307275379noreply@blogger.com2tag:blogger.com,1999:blog-1785473796865119343.post-22536110648021999222009-08-02T04:32:00.000-07:002009-08-02T04:37:56.031-07:00alphagrip chording keyboard + trackballThis is my current input device, with which I'm mostly satisfied. 
I'll post soon about some of my customizations.trtghttp://www.blogger.com/profile/11883202530307275379noreply@blogger.com2tag:blogger.com,1999:blog-1785473796865119343.post-15990047279364387932009-08-02T04:30:00.000-07:002010-10-31T21:11:37.628-07:00Cirque smartcat/pro touchpadThe only reason to buy the pro, in my view, is to get a distinct third button for linux paste capability.trtghttp://www.blogger.com/profile/11883202530307275379noreply@blogger.com1