Friday, October 31, 2014

Terry 2.0 includes ROS!

What started as a little tinker around the edges has resulted in many parts of Terry being updated. The Intel J1900 motherboard is now mounted in the middle of the largest square structure, an SSD is mounted (the black OCZ drive at the bottom), and yet another battery has been added: a large external laptop supply. The Kinect is now mounted on the pan and tilt mechanism alongside the 1080p webcam that was previously there. The BeagleBone Black has moved to its own piece of channel, and a breadboard is sunk into the main second top-level channel.


I haven't cabled up the J1900 yet. The SSD has Ubuntu and ROS installed, including a working DSLAM (strangely, it was some fun and games getting that to compile and then not to segfault right away).

I used three Actobotics beams: one, IIRC, is a 7.7 incher and the other two are shorter. The long beam runs along the right side of the motherboard as you see it in the image. It is attached with nylon bolts and has a 6.6 mm standoff between the motherboard and the beam to avoid any undesired electrical shorts. With the two shorter beams on the left side of the motherboard, the board is now attached to Terry rather securely. The little piece of channel you see on the right, up a little from the bottom, is there for the 7.7 inch beam to attach to (behind the motherboard), and a shorter beam on this side secures the floating end of that channel to the base channel.



The alloy structure at the top of the pan and tilt now has the Kinect attached. I used a plastic wall-mount adaptor whose nut traps, with great luck and convenience, lined up with the Actobotics holes. I have offset the channel as you see so that the centre of gravity is closer to directly above the pan and tilt. Perhaps I will have to add some springs to help the tilt servo when it moves the Kinect back too far from the centre point. I am also considering a counterbalance weight below the tilt, which would also help stabilize the Kinect at the position shown.



I was originally planning to put a gripper on the front of Terry, but now I'm thinking about using the relatively clean back channel to attach a threaded rod and stepper motor so that the gripper can reach both the ground and a table top. Obviously the cameras would have to rotate 180 degrees to see what the gripper was up to. Also, for floor pickups the tilt would have to handle a reasonable downward "look" without being too hard on the servo.

There were also some other tweaks. A 6 volt regulator is now inlined into a servo extension cable, and the regulator itself is bolted to some of the channel. That gives nice cooling, and it means the other end of that servo extension can take something like 7-15 V while the servo still gets the 6 V it wants. It is actually fed from the same battery pack as the drive wheels (8xAA).

One thing that might be handy for others who find this post: the BeagleBone Black case from SparkFun attaches to Actobotics channel fairly easily. I used two cheesehead M3 screws with nylock nuts and had to force the nuts into the enclosure. The nuts lined up with the Actobotics channel holes, so the attachment was very simple. You'll want a piece of channel with three or more of the big holes to attach the enclosure to. I attached it to a three-holer and then attached that channel to the top of Terry with a few threaded standoffs, which simplifies attaching and removing it should that ever be desired.

I know I need slip rings for the two USB cameras up top, and for the tilt servo as well. I can't use a USB hub up top because each of the two USB devices can fairly well saturate a USB 2.0 bus on its own. I use the hardware-encoded MJPEG from the webcam, which helps with bandwidth, but I'm going to give an entire USB 2.0 bus to the Kinect.

Wednesday, October 15, 2014

Sliding around... spinning around.

The wiring and electronics for the new omniwheel robot are coming together nicely. Having wired it up using four individual stepper controllers, one sees the value in commissioning a custom base board for the stepper drivers to plug into. I still have to connect an IMU to the beast, so precision strafing will (hopefully) be obtainable. The SparkFun mecanum video shows the more traditional two-wheels-each-side design, which does wobble a bit when strafing.
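To make the strafing pattern concrete, here is a minimal Arduino-style sketch assuming four step/dir stepper drivers in the usual mecanum "X" arrangement. The pin numbers and step rate are made up for the example and are not this robot's actual wiring; the point is only that one diagonal pair turns forwards while the other turns backwards, and that without IMU feedback the base drifts a little, which is what the IMU is meant to correct.

// Strafe a four-wheel mecanum base whose steppers sit behind step/dir drivers.
// Pin assignments and step rate are examples only.
const int STEP_PIN[4] = {2, 4, 6, 8};   // FL, FR, RL, RR
const int DIR_PIN[4]  = {3, 5, 7, 9};

// To strafe right: front-left and rear-right forward, front-right and rear-left backward.
const int STRAFE_RIGHT[4] = {HIGH, LOW, LOW, HIGH};

void setup() {
  for (int i = 0; i < 4; ++i) {
    pinMode(STEP_PIN[i], OUTPUT);
    pinMode(DIR_PIN[i], OUTPUT);
    digitalWrite(DIR_PIN[i], STRAFE_RIGHT[i]);
  }
}

void loop() {
  // Pulse all four motors in lockstep; the pulse rate sets the speed.
  for (int i = 0; i < 4; ++i) digitalWrite(STEP_PIN[i], HIGH);
  delayMicroseconds(500);
  for (int i = 0; i < 4; ++i) digitalWrite(STEP_PIN[i], LOW);
  delayMicroseconds(500);
}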


Apart from the current requirements, the new robot is also really heavy, probably heavier than Terry. I'm still working out which battery to use to meet the high current needs of four reasonably sized steppers on the move.

Tuesday, September 9, 2014

Getting a feel for Metapolator and Cascading Parameter Values


Metapolator is a new project for working on font families. It allows you to generate many fonts in a family from a few "master" fonts. For example, you could have a normal font, modify it to create a rather bold version, and then use Metapolator to generate ten shades along the line between normal and bold.

I have recently been reading up on Metapolator and how to hack on it, so this post describes my limited understanding of what is a fairly young project. The usual warnings apply: errors are my own, the code is the final arbiter, and so on.

Much of the documentation of Metapolator involves the word Meta, so I'm going to drop it for the rest of this post; seeing it all the time removes its value as a semantic add.

At the core of all of this polating are parameters. For example, after importing a font and calling it "normal" you might assign a value of 100 to xheight. I am assuming that many of the spline points in the glyph skeleton can then be defined in terms of this xheight, so the top of the 'n' might be 0.95*xheight.

A system using much the same syntax as Cascading Style Sheets is available to allow parameter values to be set or updated. Because it deals with parameters, it's called CPS instead of CSS. So you might select a glyph with 'glyph#n' and then set its xheight to 105 instead. It seems these selectors go right down to the individual point, if that's of interest.

In order to understand the CPS system I decided to start modifying a basic example and trying to get specific values back out of it. The description below is mainly to check whether my playing around is somewhat in line with the intended use of the CPS system.

For this I use a very basic CPS file:

$ cat /tmp/basic.cps
 
* {
     label : 1234;
     xx    : 5;
}

glyph#y penstroke:i(0) point:i(0) {
     xx    : 6;
}

$ metapolator dev-playground-cps /tmp/basic.cps

The existing dev-playground-cps command makes up its own fonts, so all you need is a CPS file that you want to apply to them. In my case I'm using two new properties, 'label' and 'xx', which are of type string and number respectively.

A default value of 3 is assigned to xx for all points, and each point and glyph gets a unique label during setup.

I found it insightful to test the code below with and without selectors that modify the 'xx' property in the CPS, and at both levels. That is, changing the xx:5 and xx:6 in the above CPS to xxno1:5 and xxno2:6 and seeing what the code printed out. The xx.value line makes the most sense to me: show me the default value (3) if nothing in any CPS overrides it, or show me what the CPS has set if it did override the value for the point.

// query down to an individual point of the first penstroke of glyph 'y'
element = controller.query('master#heidi glyph#y penstroke:i(0) point:i(0)')
console.log('element:', element.particulars);
console.log('element:', element.label);
console.log('element:', element.xx);
// resolve the CPS cascade for that element
computed = controller.getComputedStyle(element)
console.log('label: ' + computed.get('label'));
console.log('xx.base   : ' + computed.getCPSValue('xx'));
console.log('xx.updated: ' + computed.getCPS('xx'));
console.log('xx.value  : ' + (computed.getCPS('xx') ? computed.getCPS('xx') : computed.getCPSValue('xx')));


The above code is also pushed to a branch of my mp fork at cps.js#L213

I found that a little tinkering in StyleDict.js was needed to get things to operate how I'd expected, which is most likely because I'm using it wrong™. The main thing was changing getCPSValue to test for a local entry for a parameter before falling back to the global default: StyleDict.js#L93.

I might look at adding a way to apply a CPS file to a named font and show the resulting font as pretty JSON. For reference, this would likely have 'value' and 'valuebase' fields showing the possibly CPS-updated value and the value from the original font respectively.

Thursday, August 28, 2014

Terry is getting In-Terry-gence.

I had hoped to use a quad core ARM machine running ROS to spruce up Terry the robot, performing tasks like robotic pick and place, controlling Tiny Tim, and autonomous "docking". Unfortunately I found that trying to use a Kinect from an ARM based Linux machine can make for some interesting times, so I thought I'd dig at the ultra low end of the Intel chipset "SBC" world. The below is a J1900 Atom machine which can take up to 8 GB of RAM and sports the features one expects from a contemporary desktop machine: gigabit Ethernet, USB 3, SATA 3, and even a PCI-e expansion slot.


A big draw to this is the "DC" version, which takes a normal laptop style power connector instead of the much larger ATX connectors. This makes it much simpler to hook up to a battery pack for mobile use. The board runs nicely from a laptop extension battery, even if the power button is a bit funky looking. On the left is a nice battery pack which is running the whole PC.

An interesting feature of this motherboard is that it has no LEDs at all; I had sort of gotten used to Intel boards having blinking status and power LEDs and the like.
There should be enough CPU grunt to handle the Kinect and to start looking at doing DSLAM, and thus autonomous navigation.

Monday, July 14, 2014

Hookup wires can connect themselves...

A test of precision movement of the Lynxmotion AL5D robot arm: seeing if it could pluck a hookup wire from a whiteboard and insert it into an Arduino Uno. The result: yes, it certainly can! To be able to go from a Fritzing layout file to automatic real world jumper setup, the wires would have to be inserted in a specific order so that the gripper could overhang the unwired part of the header as it went along.


Lynxmotion AL5D moving a jumper to an Arduino, from Ben Martin on Vimeo.

Saturday, July 5, 2014

Ending up Small

It's easy to get swept up in trying to build a robot that has autonomy and the proximity sensing, environment detection and inference, and feedback mechanisms that go along with that. There's something to be said for the fun of a direct drive robot with just power control and no feedback. So Tiny Tim was born! For reference, his wheels are 4 inches in diameter.


An Uno with an analog joystick shield is used to drive Tim. He is not intended to, and probably never will be able to, move autonomously; he is direct command only. Packets are sent over a wireless link from the controller (5 V) to Tim (3.3 V). On board Tim is an 8 MHz/3.3 V Pro Micro which I got back on Arduino day :) The motors are driven by a 1 A dual TB6612FNG motor driver, which is operated very much like an L298 dual H-bridge (two direction pins and a PWM per motor). Tim's Pro Micro also talks to an OLED screen, and his wireless board is placed out behind the battery to try to isolate it a little from the metal. The OLED screen needed 3.3 V signals, so Tim became a 3.3 V logic robot.
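For anyone wanting to see what "two direction pins and a PWM" looks like in practice, here is a minimal Arduino-style sketch for one channel of a TB6612FNG. The pin numbers are made up for the example and are not Tim's actual wiring; only the driving pattern matters: set the two direction pins, set the PWM duty, and hold standby high.

// One channel of a TB6612FNG driven L298-style: two direction pins plus PWM.
// Pin numbers are examples only, not Tim's actual wiring.
const int AIN1 = 7;    // direction pin 1
const int AIN2 = 8;    // direction pin 2
const int PWMA = 9;    // PWM speed pin
const int STBY = 10;   // standby must be held high for the driver to run

// speed is -255..255; the sign selects direction, the magnitude sets PWM duty.
void motorA(int speed) {
  digitalWrite(AIN1, speed >= 0 ? HIGH : LOW);
  digitalWrite(AIN2, speed >= 0 ? LOW : HIGH);
  analogWrite(PWMA, abs(speed));
}

void setup() {
  pinMode(AIN1, OUTPUT);
  pinMode(AIN2, OUTPUT);
  pinMode(PWMA, OUTPUT);
  pinMode(STBY, OUTPUT);
  digitalWrite(STBY, HIGH);
}

void loop() {
  motorA(150);   // forward at a bit over half duty
  delay(1000);
  motorA(-150);  // reverse
  delay(1000);
  motorA(0);     // stop
  delay(1000);
}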

He is missing a hub mount, so the wheel on the left is just sitting on the mini gearmotor's shaft. At the other end of the channel is a Tamiya omni wheel which tilts the body slightly forward. I've put the battery at the back to try to make sure he doesn't flip over under hard braking.

A custom PCB would remove most of the wires on Tim and be more robust, but most of Tim is currently put together from random bits that were available. The channel and beams should come with a warning letting you know what can happen once you bolt a few of them together ;)

Tuesday, July 1, 2014

The long road to an Autonomous Vehicle.

Some people play tennis; some try to build autonomous wheeled robots. I do the latter, and the result has come to be known as “Terry”. On the long road toward that goal, what started as direct drive, “give the motor a PWM of X% of the total power”, led to having encoder feedback so that a wheel can be turned an exact amount over a given time. This has meant that the control interface no longer uses direct power and direction sliders but has buttons that perform specific motion tasks. The button with the road icon and a 1 will move the wheels forward a single rotation, advancing Terry 6π inches (the wheels are 6 inches in diameter).
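The arithmetic behind that button is just the wheel circumference, and converting a requested distance into encoder counts is a one-liner. The sketch below assumes a hypothetical 1000 ticks per wheel revolution purely for illustration; that is not Terry's actual encoder resolution.

#include <cmath>
#include <cstdio>

// Terry's wheels are 6 inches in diameter; the tick count is a made-up example.
const double WHEEL_DIAMETER_INCHES = 6.0;
const double TICKS_PER_REV = 1000.0;

// Convert a requested travel distance into encoder ticks.
long distanceToTicks(double inches) {
  const double circumference = M_PI * WHEEL_DIAMETER_INCHES;  // 6π ≈ 18.85 inches
  return std::lround(inches / circumference * TICKS_PER_REV);
}

int main() {
  // The single-rotation button asks for exactly one circumference of travel.
  double oneRotation = M_PI * WHEEL_DIAMETER_INCHES;
  std::printf("one rotation = %.2f inches = %ld ticks\n",
              oneRotation, distanceToTicks(oneRotation));
}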

The red-tinted buttons use the wheel encoders to provide precision movement. Enc left and Enc right turn Terry in place (one wheel forward, one wheel backwards). Circus and Oval perform patrol-like maneuvers. Circus moves forward, turns in place, returns, and turns around again; it's like a strafing patrol. The Oval, on the other hand, turns like a regular car, with both wheels going forward but one moving slower than the other to create the turning effect. The pan and tilt buttons control the on-board camera for a good Terry-eye view of the world.
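Purely as a sketch of how Circus decomposes into those encoder-based moves, here it is written against two hypothetical helper functions. They are stand-ins for illustration, not Terry's actual control API, and here they only print what they would do.

#include <cstdio>

// Hypothetical stand-ins for Terry's encoder-based motion primitives.
void forwardRotations(double rotations) { std::printf("forward %.1f rotations\n", rotations); }
void turnInPlaceDegrees(double degrees) { std::printf("turn in place %.0f degrees\n", degrees); }

// Circus: move out, about-face, return, about-face again, ready to repeat.
void circusPatrol() {
  forwardRotations(2);
  turnInPlaceDegrees(180);
  forwardRotations(2);
  turnInPlaceDegrees(180);
}

int main() { circusPatrol(); }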

I added two-way WebSocket communications to Terry this week, so now the main battery voltage and the current, err, current for each motor are shown at the bottom of the page. The two white boxes there give version information so you can see whether the data is stale or not. I suspect a little async JavaScript on some model value will be coming so that items on the page darken as they become stale due to lag on Terry or a communications loss.

I've been researching proximity detection and will be integrating a Kinect for longer-distance measurements soon. Having good, reliable distance measurements from Terry to surrounding objects is the first step in getting him to build maps of the environment for autonomous navigation.