Pick and Place Project

Objective
To develop a low-cost, reproducible, open-source pick and place system for automated SMT electronics assembly.

Progress so far:
Successfully controlling a cartesian table through EMC2 via a Python script on a remote machine.
Preliminary on-paper design of a variable-width T&R feeder
(Mostly) fabricated vacuum / rotation / optics head
Vacuum pump (hacked aquarium pump) works for 0603/0805 without suction cups
Some optical tests: image segmentation (determine what parts of the image constitute a “part”), and some optical derotating schemes.

Requirements and “nice to haves”
Cartesian table

--Standard 3-axis CNC-like machine that can accept the required head
--Realtime motion controller that can be driven by described pick-n-place software (EMC with emcrsh/halrmt + python via telnetlib)

Optical system

--USB camera following a grabbable standard (UVC, WIA?)
----Software-controlled focus? Had a brief look and couldn't find any in the size / cost range of interest. The WB-7142, a 2MPx webcam with autofocus: too good to be true? Also found the "Cubeternet", which mounts easily via 4 mounting holes once the case is removed :-) Also, word on the street is that auto/software-adjustable (e.g. voice-coil) focus is unreliable, especially in motion and when the camera is not facing horizontally.
----Fixed (user-set) focus -- probably set pretty near
----Active lighting - LED source
------May need multiple, switchable angles to show part numbers clearly for feature matching
----Calibration table (paper/printout)
----SURF for discrete feature detection. The Processing implementation did not work well in my tests; it loves micro-features (paper grain) and ignores macro features, e.g. part legs or corners of text
------Abstracted interface of (points / transformation matrix / stats); hoping this simplifies switching to new feature-detection engines as they improve and/or due to patent encumbrances
----Stitching close-cam images into a 'big picture' for possible fiducial finding and board auto-array
------Hugin's command-line tools / autopano provide this (rough scripting sketch below)
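An untested sketch of what scripting the stitch might look like; tool names are from the current hugin-tools package (the autopano-era tools differ), and the flags will likely need tuning:

    import glob, subprocess

    def stitch(images, prefix="board"):
        pto = prefix + ".pto"
        subprocess.check_call(["pto_gen", "-o", pto] + images)      # build project
        subprocess.check_call(["cpfind", "-o", pto, pto])           # find control points
        subprocess.check_call(["autooptimiser", "-a", "-l", "-s", "-o", pto, pto])
        subprocess.check_call(["nona", "-m", "TIFF_m", "-o", prefix, pto])  # remap tiles
        tiles = sorted(glob.glob(prefix + "0*.tif"))
        subprocess.check_call(["enblend", "-o", prefix + "_stitched.tif"] + tiles)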

Placement Head

--Suction Needle
----On/off switching
----(optional) positive pressure capability to ensure part releases - needed?
--180deg rotator
--Camera
--Bump Detect

Paste Dispense Head

--Needed? Or is everyone using stencils?

Glue Dispense Head

--Needed? Probably not a priority for hobby circuits; KISS (keep it single sided).

Feeders:

--Reel
----Accept T&R of any width/depth (8mm+); standard and (eventually) mega (right-side elongated hole)
----Advance tape a computer-controlled amount per placement
------May not be terribly accurate; this is (part of) why the camera is needed
----Pull and dispose (spool or eject) tape cover; dispose empty tape
----(optional) Attempt to detect feed pitch and/or mispicks by detecting light through part wells?
--Tube
----Accept tubes of all reasonable dimensions (there must be a standard; what/where is it?)
----Liberate parts from tube in a controlled-ish manner:
------Long screw to poke them out from the other side?
------Lift one end and vibrate them out; use 'loose' method to pick??
------'Catcher' endstop positioned exactly matching the length of one part?
--Tray
----No special hardware; tray support will probably be a software task
--Loose?
----Match centering and rotation to a known reference (hand-placed and autophoto'ed?); see the matching sketch after this list
----(optional) Detect parts beneath preset 'matchedness' threshold and avoid/warn (upside down or wrong part?)
----(optional) Attempt to flip upside-down parts by dropping repeatedly (may require enclosed 'drop tank' and cushioning; not suitable for all parts)
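
For illustration, an untested sketch of the centering-match idea using OpenCV template matching; the filenames and threshold are made up. Note this handles translation only; rotation would need a sweep over angles as well:

    import cv2

    # hypothetical filenames: a camera frame and a photo of a hand-placed reference part
    scene = cv2.imread("camera_frame.png", 0)       # grayscale camera frame
    template = cv2.imread("reference_part.png", 0)  # known-good reference

    result = cv2.matchTemplate(scene, template, cv2.TM_CCOEFF_NORMED)
    min_val, max_val, min_loc, max_loc = cv2.minMaxLoc(result)

    MATCHEDNESS_THRESHOLD = 0.7   # made-up preset; below it, warn/avoid
    if max_val < MATCHEDNESS_THRESHOLD:
        print "poor match (%.2f): upside down or wrong part?" % max_val
    else:
        cx = max_loc[0] + template.shape[1] / 2.0
        cy = max_loc[1] + template.shape[0] / 2.0
        print "part centered at (%.1f, %.1f), matchedness %.2f" % (cx, cy, max_val)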

Comments

10 responses to “Pick and Place Project”

  1. John

    Can you provide info on how you control EMC from a script? I too am working on a pick and place robot, and I was hoping to leverage EMC2, but I couldn’t find any documentation on how to do that. Where did you find info on how to provide commands to EMC2?

    Here are a couple of links for people considering similar pick and place machines, though some of them are purchasing software; I’d prefer to write the software myself, like you are.

    http://www.cnczone.com/forums/showthread.php?t=97551
    http://www.vonnieda.org/openpnp

    I had one post with my ideas for how I would do the vision, but they are just ideas. http://www.cnczone.com/forums/showpost.php?p=769682&postcount=131

    Is it possible to control EMC from the same computer, or do you specifically desire remote access?

    Thanks for the info ahead of time!

  2. Tim

    @John: Cool! Sounds like there are a few people tinkering with PnP lately, just not together :-) I’d be happy to jump into an existing PnP project, although I don’t have that much time to work on it at the moment. Right now I’m concentrating on a mass-self-produceable cut tape feeder – after the machine itself, that’s the big expense of the commercial machines.

    Here is what I found for driving EMC2 from an external program:

    http://wiki.linuxcnc.org/emcinfo.pl?Emcrsh
    http://wiki.linuxcnc.org/emcinfo.pl?Halrmt

    Emcrsh is a remote shell (Telnet server, essentially) that allows some control over the running EMC2 instance. Most notably it exposes the MDI (manual data input) interface, so gcode can be generated and sent to the machine on the fly. I chose this route since gcode is easy and portable, and EMC2 works just as well with machines of non-trivial kinematics (SCARA arms, elephant-trunk bots) with no code changes on the PnP controller end. Best of all, it works just as well on the same machine or over a network.

    “Remote” access (i.e. from a separate PC) is not a firm requirement for me, but it adds a bit of flexibility: e.g. putting the CPU-hungry vision processing (if/when supported) on a separate machine from the CPU-hungry realtime controller, or running the PnP from another distro or win32 out of convenience or necessity (OpenCV 2.x / pyopencv are not available on a supported EMC2 configuration, namely Ubuntu 8.10, due to dependency problems).

    Here is what I did for a “hello world” emcrsh test. Note that I have not tested halrmt yet; it is a similar interface to the HAL, which could be useful for accessing special hardware like tape feeders (although defining custom M-codes might be a better long-term approach):

    In your EMC2 machine’s .hal configuration, add:

    loadusr halrmt
    loadusr emcrsh

    On the controlling machine, make sure python and its telnetlib are installed. My test script was:

    import sys, time
    import telnetlib

    EMC_HOST = "razor"       # hostname of the machine running EMC2
    EMC_RSH_PORT = 5007      # emcrsh
    EMC_HAL_PORT = 5006      # halrmt (untested)
    EMC_PASSWORD = "EMCTOO"  # default password for 'set enable'
    timeout = 3

    def main(argv):
        emc = telnetlib.Telnet(EMC_HOST, EMC_RSH_PORT, timeout)
        emc.set_debuglevel(9999)

        # should wait a moment here for emc to spit out any messages or
        # newlines (if any), although it's surprisingly un-verbose...
        # HACK: emcrsh (sometimes? only across hosts? only in windows?)
        # ignores the first line sent, so send a dummy line first; a
        # properly-behaving emcrsh "should" ignore (or echo) it. Also, it
        # requires \r\n line termination despite being on linux. wtf?
        time.sleep(1.0)
        emc.write("Dummy line, please ignore (bug fix)\r\n")
        time.sleep(1.0)
        emc.write("hello EMC 1 1\r\n")
        emcResponse = emc.read_until("\n", timeout)

        emc.write("set verbose on\r\n")
        # password required to enable most SET commands, thanks newsgroup groveling
        emc.write("set enable %s\r\n" % EMC_PASSWORD)
        emc.write("set mode mdi\r\n")

        # now you can send MDI commands, assuming the machine is in a
        # runnable state (homed, etc.)
        emc.write("set mdi G00 X1 Y0 Z0\r\n")
        print emcResponse
        emc.write("quit\r\n")
        print emc.read_eager()
        emc.close()

    if __name__ == "__main__":
        main(sys.argv)

  3. Tim

    I also tried a simple rotational alignment algorithm. Basically the idea is to take the photo of the part, rotate the photo in small increments (modern videocard provides this service for free :-) , edge-detect the resulting image, then sum the pixel values along each row/column to produce a row and column sum vector. Finally, nonlinearly gain (e.g. square) each element in the row/col sums and sum the results to produce a single ‘alignedness’ value. The theory is that the “hot” edge-detected pixels in a correctly-aligned part will lie mostly along vertical and horizontal lines and thus contribute more to the sum when so aligned than when those same edges are spread across multiple rows/cols.

    My quick-n-dirty test of this “worked” (very poor alignment produces a large difference), but the difference between “aligned” and “almost aligned” (1-2 degree skew) was not measurable. Still some more work to be done…!
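
    For the curious, here is a minimal sketch of that metric with OpenCV and numpy; the function names, Canny thresholds, and angle sweep are illustrative, not the exact test code:

    import cv2
    import numpy as np

    def alignedness(gray, angle_deg):
        # rotate the part photo, edge-detect, then score how strongly the
        # "hot" edge pixels concentrate into individual rows and columns
        h, w = gray.shape
        m = cv2.getRotationMatrix2D((w / 2.0, h / 2.0), angle_deg, 1.0)
        rotated = cv2.warpAffine(gray, m, (w, h))
        edges = cv2.Canny(rotated, 50, 150).astype(np.float64)
        rows = edges.sum(axis=1)   # row sum vector
        cols = edges.sum(axis=0)   # column sum vector
        # nonlinear gain: squaring rewards edges packed into few rows/cols
        return (rows ** 2).sum() + (cols ** 2).sum()

    def best_rotation(gray, span=10.0, step=0.5):
        # brute-force sweep of small rotations; the highest score wins
        angles = np.arange(-span, span + step, step)
        return max(angles, key=lambda a: alignedness(gray, a))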

  4. DVandervort

    I keep running into both of you ;)

    Concerning the software aspect: mine is done in Windows and communicates over USB, but if you want to use EMC my software could easily be adapted to output g-code. It essentially does that right now; it just doesn’t output an ASCII file.

    Regards,
    Darren Vandervort

  5. John

    Tim,
    Thank you very much for the info. I’ll try it out soon.

    John

  6. Owen

    Many thanks for alerting me to emcrsh – it’s exactly what I needed.

    I’ve got a simple demo of the machine running up on vimeo, just append 13891466 to the address.

  7. Tim

    @Owen: That is awesome!

  8. […] at Outguessing the machine has started an open source SMD pick and place project.  So far he has a head design, some scripts for EMC2 and some preliminary optical testing done. […]

  9. Tom Smith

    You will be able to get SURF to work better if you filter the images to remove high-frequency noise (e.g. paper texture). Doing so should not affect the location accuracy of your component.

  10. Tom Smith

    That is assuming that your paper texture is significantly higher frequency than the features you want to detect, of course.

    It might be worth collecting a set of test images of components in strip, and on needle, and posting them online for people to play with.
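
    A quick, untested sketch of that filter-then-SURF idea with OpenCV; the kernel size is a guess to tune against the grain scale, and SURF lives in the contrib xfeatures2d module in modern OpenCV builds (patent-encumbered in some):

    import cv2

    gray = cv2.imread("part_in_strip.png", 0)      # hypothetical test image
    smoothed = cv2.GaussianBlur(gray, (9, 9), 0)   # low-pass: kill paper grain
    surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)
    keypoints, descriptors = surf.detectAndCompute(smoothed, None)
    print "%d keypoints survive the filtering" % len(keypoints)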
