Handicam: Optical registration for Handibot?

Handicam software running beside Handibot with attached markerboard and camera.


I use a Handibot for CNC woodwork. If you aren’t familiar, Handibot is a portable CNC router that you place on top of your workpiece to make pre-programmed cuts. It can cut the same designs and perform most of the same tasks as a full-sized CNC machine, but it’s compact enough to pick up and move around. Since the machine sits on top of the workpiece, the size of project it can tackle is virtually unlimited. The downside of the compact design is that it only cuts a 6″×8″ area at a time before the operator must physically lift and reposition the machine for the next “tile” of a cut. So for a large project on a 4′×8′ sheet, you may reposition and register the Handibot 96 times.

VCarve displays a large cut with 96 tiles.

Today, when I’m performing multi-part cuts, I use a custom rigid jig to register the position of each cut. These jigs aren’t bad, but each one requires building a new tool (possibly with access to different machinery), takes up additional workspace, and physically limits the size of my projects.

For example, my “3 bay jig” limits me to 24″ on the Y axis and requires a 44″ deep work surface. When I use this, I’m not getting all the value I want from a compact CNC machine.

Idea: Optical Registration

I’m going to try an experiment: can I improve my experience with the Handibot by using computer vision to register multi-part cuts? In short, can I use computer vision to capture a full view of my workpiece (i.e., “scan” it), then identify the target and current position of the machine on the work surface, for each cut, with high precision (±0.01″)? And will precise computer-vision-based registration make for a better user experience?
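For a sense of what ±0.01″ demands of the camera, here’s a quick back-of-envelope calculation. The resolution and field-of-view numbers below are illustrative assumptions, not Handicam’s actual configuration:

```python
# Back-of-envelope: physical width covered by one pixel for a given
# camera field of view. Numbers are illustrative, not Handicam's setup.

def inches_per_pixel(fov_width_in, sensor_width_px):
    """Workpiece width imaged by a single pixel, in inches."""
    return fov_width_in / sensor_width_px

# A 1920 px-wide camera viewing a full 48 in sheet edge to edge:
full_sheet = inches_per_pixel(48.0, 1920)   # 0.025 in/px -- coarser than the goal
# The same camera viewing a 12 in-wide region near the machine:
close_view = inches_per_pixel(12.0, 1920)   # 0.00625 in/px -- below 0.01 in
print(f"full sheet: {full_sheet} in/px, close view: {close_view} in/px")
```

Sub-pixel corner refinement (which OpenCV’s marker detectors provide) improves on the raw pixel pitch, which is part of why a camera-based approach can plausibly hit the target.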

Full disclosure: I chose this project as an opportunity to learn about computer vision and get experience with OpenCV. My goal was not to compare registration methods and find an ideal one, so I did not seriously consider alternatives. That said, I don’t believe my precision requirements could be met by alternatives such as LIDAR, light-based proximity sensors (e.g., the popular Sharp sensors), or ultrasonic sensors. If you disagree and can point me to a good alternative to a camera-based solution, please let me know.

Summarizing Findings

You can use and modify Handicam yourself, and I’ll provide more details on GitHub. For this post, I’m jumping straight to findings.

A clean workpiece.
Masking tape added to workpiece to provide landmarks for image recognition.
Scanning a workpiece using Handicam software. Visible in this image are a custom camera mount and the custom markerboard that’s used to precisely determine camera orientation and physical world dimensions. Also note, the Handibot isn’t yet present.
Registering the Handibot with support from Handicam. Note that the markerboard is now sitting beside the Handibot. The Y-axis of the Handibot must be set about halfway back so the camera’s view of the markerboard and workpiece is not obstructed.
Image of Handicam software. The window on the left shows a whole workpiece with 9 tiles. The window on the right shows the Handibot’s current position relative to the selected tile. The +/- error is calculated by comparing measured offsets over a number of frames.

Good news: We can create an optical registration solution for the Handibot with better than 1/100” precision. Doing so requires some custom parts, including a camera mount and Aruco marker board. It also requires a good camera with control over focus and exposure and reasonably good lighting. The code on GitHub is available as a proof of concept.

Now the bad news: using Handicam requires manually placing the machine for each tile. Accurately maneuvering the Handibot, which weighs about 50 lbs and has an anti-skid rubber bottom, is difficult and cumbersome. For example, while Handicam can report that the Handibot needs to move 1/32″ on the X-axis, lifting and shifting the machine that distance by hand, within a tolerance of 1/100″, takes a lot of frustrating trial and error. In my experience, I could do it, but adjusting the position for each tile took minutes.

Even with access to Handicam, I still prefer a rigid jig. A jig gives me reasonable precision with significantly less effort per tile. That said, some areas of future work could greatly improve the solution and make it viable:

On-demand single-tile CAM, based on the Handibot’s current position: Rather than pre-programming a grid of “tiles” for a complete multi-part cut and forcing the user to accurately position the machine for each one, generate the cut instructions on demand for wherever the user places the machine. This way, the user can roughly move the machine to where a cut is needed, without fussing over accurate placement, and still get a precisely aligned cut based on the machine’s measured position on the workpiece.
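At its core, the on-demand idea is a rigid 2D transform: map design coordinates from the workpiece frame into the machine’s measured local frame, twist and all. A sketch of that math; the function name, conventions, and numbers are my own assumptions, not Handicam’s actual API:

```python
import math

def workpiece_to_machine(points, mx, my, theta):
    """Map design points (workpiece frame, inches) into the machine's
    local frame, given the machine's measured position (mx, my) and
    rotation theta (radians) on the workpiece.

    Illustrative math only -- names and sign conventions are assumed.
    """
    cos_t, sin_t = math.cos(theta), math.sin(theta)
    out = []
    for x, y in points:
        dx, dy = x - mx, y - my
        # Inverse rotation: undo the machine's orientation so the
        # toolpath is expressed in the machine's own axes.
        out.append((cos_t * dx + sin_t * dy,
                    -sin_t * dx + cos_t * dy))
    return out

# Machine set down at (10, 4) with a slight 1-degree twist; a cut at
# workpiece point (11, 5) becomes a local toolpath coordinate:
local = workpiece_to_machine([(11.0, 5.0)], 10.0, 4.0, math.radians(1.0))
```

Because Handicam already measures position and orientation optically, regenerating a tile’s G-code through a transform like this would remove the need to place the machine precisely at all.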

Autonomous movement for the Handibot: Turn the Handibot into a mobile robot, using wheels, tracks, or belts for accurate repositioning. Since starting work on Handicam, I’ve learned about some new options for CNC routing with autonomous motion. Even considering those, I think the Handibot and its tile-based approach to large cuts may still offer advantages: the weight and anti-skid features that make manual repositioning tedious between cuts are a virtue at cut time, because they keep the machine precisely aligned despite the large forces on the router that could otherwise shift the whole machine. Further, the tile-at-a-time approach lets the user add extra work-holding when needed.

I don’t currently have any concrete plans to pursue these ideas, but if I tinker more, I’ll try to share. Thanks!

Xaar 128 Printhead Driver

The Xaar 128 is a piezoelectric inkjet printhead used in large-format vinyl sign-making. It *might* be useful in 3D printing, conductive-ink, or masking applications.

Why piezo? TL;DR: Most inkjet printheads are “thermal”: they work by superheating a fraction of the ink in a chamber, turning it into gas that expands and forces the remaining ink out of a nozzle. Superheating limits the range of materials these printheads can handle. Piezoelectric printheads are less common; because they use a mechanical action to force fluid out of a nozzle, they don’t have to change the state of the fluid and can work with a broader range of materials.

More details on the RepRap wiki.

Starter source code on GitHub.
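One of the driver’s basic jobs is framing per-nozzle data for the head’s serial interface. The actual Xaar 128 signal names, clocking, and bit order are in its datasheet and the GitHub code; the sketch below only illustrates the general shape of that job, packing a 128-nozzle column into bytes, with the MSB-first ordering as my own assumption:

```python
def pack_column(nozzles):
    """Pack 128 on/off nozzle states into 16 bytes, MSB-first.

    Purely illustrative framing: the real Xaar 128 interface has its
    own clocking and bit order (see the datasheet and the GitHub
    driver); this only shows the kind of work the driver does.
    """
    if len(nozzles) != 128:
        raise ValueError("Xaar 128 expects one state per nozzle (128)")
    out = bytearray(16)
    for i, on in enumerate(nozzles):
        if on:
            out[i // 8] |= 0x80 >> (i % 8)
    return bytes(out)

# Fire every other nozzle: each byte becomes 0b10101010 (0xAA).
frame = pack_column([i % 2 == 0 for i in range(128)])
```

On a microcontroller, the equivalent buffer would then be shifted out to the head under the timing constraints the datasheet specifies.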


  • While I planned to try different materials with the Xaar 128, I started with the solvent ink it’s designed for. I mostly used second-hand printheads that I could buy inexpensively on eBay, since new Xaar 128s are pretty expensive. Nozzle clogs were a big problem: I had to flush the nozzles every time I sat down to work, which wasn’t really compatible with an after-hours hacking schedule.
  • I used flexible flat cable (FFC or FPC) to connect my board to the printhead. I’ve been burnt by overflexing ribbon cable before, so I thought this was a good idea. But I didn’t properly anchor the connection points. After some use, I started getting erratic behavior and stalling from the printhead. After a lot of debugging, I found that the leads on one end of my cable had overflexed and would break contact at certain points in the movement. Lesson: anchor connection ends so that no flexing happens near the exposed leads.
  • I was never able to consistently push anything more viscous than solvent ink through the printheads. Epoxy and photo-curing resin are much more viscous (≈1,000–2,000+ cP vs. ≈10–20 cP for solvent ink). This means these heads may be useful for something like depositing a low-viscosity binder for powder printing, but probably not for depositing a material that can harden into a solid by itself. I’d love to find a printhead that can.