Student Projects


Deep Red - Checkers Playing Robotic Arm (Olin College)

Contact Information

University: Franklin W. Olin College of Engineering

Team Member(s): Matt Alvarado, Scott Thompson, Jessie Rucker, Kate Swift-Spong, John Harley

Faculty Advisor: Professor Dave Barrett

Email Addresses: Matthew.Alvarado@students.olin.edu, Scott.Thomson@students.olin.edu, Jessica.Rucker@students.olin.edu, Katelyn.Swift-Spong@students.olin.edu, John.Harley@students.olin.edu

Project Information

Title: Checkers Playing Robotic Arm
Description:

Deep Red is a checkers-playing robotic arm built by a team of undergraduates from Olin College of Engineering. The project uses an ST Robotics five-axis arm and controller, a Point Grey FireFly camera, an NI DAQ, a robotic gripper hand, and LabVIEW to play a full game of checkers.

Products:

LabVIEW 2010

NI USB-6501 DAQ

LabVIEW Vision Toolkit

The Challenge:

The goal of the project was to take an ST R17 five-axis articulated robotic arm and have it win a full, legal game of checkers against our professor. To meet this requirement, the system needed a sensing framework that incorporated both a vision system and a sensor suite, allowing it to detect changes in board configuration as well as other objects such as hands and arms. To process this environmental data, a thinking component was also necessary, so the system could perform the complex algorithms required to choose optimal moves. Lastly, an actuation component was required to communicate with the arm's controller box and send the appropriate movement commands.

The Solution:


Current System:

The design of Deep Red is based on the sense-think-act paradigm. This architecture fits naturally with both the dataflow model that LabVIEW is built upon and the serial gameplay of checkers.

checkers arm.PNG
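Since the actual implementation is a LabVIEW block diagram, the overall architecture can only be suggested in text. Below is a minimal Python sketch of how a serial sense-think-act loop might be wired together; every function name here is a placeholder, not part of the real system.

```python
# Minimal sketch of a sense-think-act game loop (placeholder names;
# the real system is a LabVIEW dataflow diagram).

def play_game(sense, think, act, game_over):
    """Run one sense-think-act cycle per Deep Red turn."""
    board = None
    while not game_over(board):
        board = sense()       # camera + sensor suite read the world
        if board is None:     # e.g. foreign object detected: do nothing
            continue
        move = think(board)   # recursive MiniMax picks a move
        act(move)             # serial commands drive the arm
```

The serial structure mirrors checkers itself: nothing is actuated until sensing and thinking for the current turn have completed.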

  • Sense:

    

    • Vision System:

Deep Red's sensory platform is built around a Point Grey FireFly camera mounted above the checkers board, giving it a clear bird's-eye view. The camera is primarily used to detect the board and the pieces on it. The vision algorithm uses the lines of the board to auto-calibrate the camera and to detect foreign objects in the playing field. For safety, if a foreign object is detected, Deep Red does not proceed to the think or act stages.

camera.PNG
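As a rough illustration of the square-classification step (the actual code uses LabVIEW Vision Toolkit VIs), each of the 64 squares could be classified by its mean pixel intensity. The thresholds and the piece/brightness mapping below are assumptions for the sketch, not values from the project.

```python
# Illustrative sketch only: classify each of the 8x8 squares by mean
# pixel intensity. Thresholds and color mapping are invented here.

def board_from_image(gray, origin, square_px, red_thresh=170, black_thresh=80):
    """gray: 2-D grayscale image (list of rows); origin: board's top-left pixel."""
    r0, c0 = origin
    board = []
    for r in range(8):
        row_out = []
        for c in range(8):
            total, count = 0, 0
            for i in range(r0 + r * square_px, r0 + (r + 1) * square_px):
                for j in range(c0 + c * square_px, c0 + (c + 1) * square_px):
                    total += gray[i][j]
                    count += 1
            mean = total / count
            if mean >= red_thresh:
                row_out.append("r")   # bright blob: red piece (assumed)
            elif mean <= black_thresh:
                row_out.append("b")   # dark blob: black piece (assumed)
            else:
                row_out.append(".")   # mid-tone: bare square
        board.append(row_out)
    return board
```

Auto-calibrating from the board lines would supply `origin` and `square_px`, so the grid stays aligned even if the camera or board shifts slightly.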


  • Think:
    • Determine Possible Moves:

To perform most of its higher-level processing, Deep Red uses a recursive algorithm, written in LabVIEW, to calculate all possible moves for a given board configuration.
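The VI itself is graphical, but a recursive move generator for checkers might look like the following Python sketch. The board encoding, the assumption that red men move toward row 0, and the forced-jump rule are our illustrative assumptions about standard play, not details taken from the project.

```python
# Hypothetical transcription of a recursive checkers move generator.
# Board: 8x8 grid of "." (empty), "r"/"b" (men), "R"/"B" (kings).
# Assumption: red men move toward row 0, black men toward row 7.

def directions(piece):
    if piece == "r": return [(-1, -1), (-1, 1)]
    if piece == "b": return [(1, -1), (1, 1)]
    return [(-1, -1), (-1, 1), (1, -1), (1, 1)]   # kings move both ways

def on_board(r, c):
    return 0 <= r < 8 and 0 <= c < 8

def jumps_from(board, r, c, piece, enemy, path):
    """Recursively extend multi-jump sequences starting at square (r, c)."""
    found = []
    for dr, dc in directions(piece):
        mr, mc, lr, lc = r + dr, c + dc, r + 2 * dr, c + 2 * dc
        if (on_board(lr, lc) and board[mr][mc].lower() == enemy
                and board[lr][lc] == "."
                and (mr, mc) not in {p[1] for p in path}):  # no re-capture
            hop = path + [((r, c), (mr, mc), (lr, lc))]
            deeper = jumps_from(board, lr, lc, piece, enemy, hop)
            found.extend(deeper if deeper else [hop])
    return found

def legal_moves(board, player):
    """All legal moves for player 'r' or 'b'; jumps are forced (standard rule)."""
    enemy = "b" if player == "r" else "r"
    jumps, steps = [], []
    for r in range(8):
        for c in range(8):
            piece = board[r][c]
            if piece.lower() != player:
                continue
            jumps.extend(jumps_from(board, r, c, piece, enemy, []))
            for dr, dc in directions(piece):
                nr, nc = r + dr, c + dc
                if on_board(nr, nc) and board[nr][nc] == ".":
                    steps.append([((r, c), None, (nr, nc))])
    return jumps if jumps else steps
```

Each move is a list of (from, captured, to) hops, so a multi-jump chain falls out of the recursion naturally.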

   

    • MiniMax Algorithm:

Deep Red uses a recursive MiniMax algorithm written in LabVIEW to determine the best move to make. The algorithm uses a board-ranking heuristic to score candidate moves and selects the best possible move for Deep Red. Because the system runs on a laptop, the recursion depth was limited; the final version of the code looks ahead six moves to find the optimal move.
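A generic depth-limited MiniMax, with the game supplied as callbacks, could be sketched in Python as follows. This is illustrative only; the team's board-ranking heuristic and LabVIEW implementation are not reproduced here.

```python
# Hedged sketch of depth-limited MiniMax. The game is abstracted into
# callbacks: moves(state, maximizing), apply_move(state, move), and
# evaluate(state) (the board-ranking heuristic).

def minimax(state, depth, maximizing, moves, apply_move, evaluate):
    """Return (score, best_move) searching `depth` plies ahead."""
    options = moves(state, maximizing)
    if depth == 0 or not options:
        return evaluate(state), None
    best_score, best_move = None, None
    for move in options:
        score, _ = minimax(apply_move(state, move), depth - 1,
                           not maximizing, moves, apply_move, evaluate)
        if best_score is None or (maximizing and score > best_score) \
                or (not maximizing and score < best_score):
            best_score, best_move = score, move
    return best_score, best_move
```

With depth 6, the search alternates three of Deep Red's moves with three of the opponent's replies, which is why search time grows quickly with depth on a laptop.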

    • Local Memory:

Deep Red stores the previous layout of the board and compares it with the current layout. This local memory allows Deep Red to check for legal moves and to identify whether its opponent has acquired a king.
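The frame-to-frame comparison can be sketched as a simple board diff; the function and encoding below are placeholders for illustration, not the project's code.

```python
# Illustrative sketch: compare the stored board with the freshly sensed
# one to recover which squares changed and spot a new king.
# Boards are 8x8 grids of ".", "r", "b" (men) and "R", "B" (kings).

def diff_boards(previous, current):
    """Return (vacated, occupied, promoted) square lists between frames."""
    vacated, occupied, promoted = [], [], []
    for r in range(8):
        for c in range(8):
            before, after = previous[r][c], current[r][c]
            if before == after:
                continue
            if after == ".":
                vacated.append((r, c))
            else:
                occupied.append((r, c))
                if after.isupper() and not before.isupper():
                    promoted.append((r, c))   # a king appeared here
    return vacated, occupied, promoted
```

Checking the opponent's observed move against the set of legal moves for the stored board is then a straightforward membership test.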

  • Act:
    • Arm Actuation:

To manipulate checkers pieces, Deep Red uses a five-axis ST Robotics arm. A serial command is sent via LabVIEW to the arm's controller box, which actuates the stepper motors that drive the arm. The arm uses a dead-reckoning system to keep track of its position.

controller box.PNG
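Dead reckoning here means the software tracks position by accumulating the moves it has commanded, rather than reading it back from sensors. A minimal sketch follows; the command string format and scale factor are invented for illustration and do not reflect the ST Robotics serial protocol.

```python
# Hedged sketch of dead reckoning: position is tracked by summing
# commanded stepper moves. The wire format and steps-per-mm value
# below are placeholders, not the real controller protocol.

class DeadReckoner:
    def __init__(self, steps_per_mm=10.0):
        self.steps_per_mm = steps_per_mm
        self.position = [0.0, 0.0, 0.0]        # x, y, z estimate in mm

    def command(self, axis, steps):
        """Record a commanded move; return the serial payload we'd send."""
        self.position[axis] += steps / self.steps_per_mm
        return f"MOVE {axis} {steps}\r"        # hypothetical wire format
```

The weakness of this scheme is that any slipped step goes unnoticed, which is one reason the vision system re-detects the board every turn.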



    • Gripper Actuation:

Deep Red is equipped with a robotic gripper built on Olin's rapid-prototyping machine. A linear stepper motor closes the gripper's three fingers around a checker piece, allowing the arm to move it. The gripper is actuated by an NI USB-6501 DAQ, which sends digital commands to an RMS R208 stepper motor driver.

IMG_0627.jpg

gripper actuation.PNG
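Stepper drivers of this kind are typically commanded with direction and step pulses on digital lines; the sketch below shows that idea in the abstract. Line ordering, polarity, and the exact R208 interface are assumptions, not details from the project.

```python
# Illustrative sketch: build the digital waveform a DAQ would clock out
# to a step/direction stepper driver. Polarity here is an assumption.

def step_waveform(steps, close=True):
    """Return (direction_level, step_levels) for one gripper move."""
    direction = 1 if close else 0       # assumed: high = close fingers
    levels = []
    for _ in range(abs(steps)):
        levels.extend([1, 0])           # one rising edge per motor step
    return direction, levels
```

Writing these levels out one sample at a time over the USB-6501's digital lines is what advances the linear stepper and closes the fingers.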

    • Emergency Stop:

               The system is equipped with an emergency stop in case Deep Red malfunctions. The button freezes the arm without cutting power, so the arm does not fall and crash into the table.

               e stop.PNG
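The key design point is that the e-stop is a latched hold, not a power cut: the motors stay energized so the arm keeps its position. A small sketch of that behavior in software, with placeholder names:

```python
# Hedged sketch of e-stop behavior: a latched flag gates motion
# commands; the arm is held rather than de-energized, so it cannot
# collapse onto the table. All names here are placeholders.

class EStop:
    def __init__(self):
        self.tripped = False

    def press(self):
        self.tripped = True            # latches until explicitly reset

    def gate(self, command):
        """Pass a motion command through only when the e-stop is clear."""
        return "HOLD" if self.tripped else command
```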

Future Development

Deep Red has many areas for future improvement. We added several passive IR sensors to determine whether a person is interfering with the board, but they have yet to be integrated into the code. Additional sensors, such as small sonars, could also be mounted around Deep Red to add to its environmental awareness. Another improvement would be increasing computation speed by porting our code to an FPGA on an NI cRIO; this would hardware-accelerate the code and allow us to search more moves ahead. The last major improvement would be to integrate the kinematics VIs from the LabVIEW Robotics Module into our code and actuate the arm ourselves. Currently we rely on the controller unit shipped with the arm, which does not allow us to interface with the arm while it is moving. This is not ideal if we want to e-stop the arm in software when the system detects a person entering the board area.

Team Photo:

team.JPG

Nominate Your Professor:

Professor David Barrett has pioneered the use of LabVIEW and other NI technologies at the Franklin W. Olin College of Engineering. Professor Barrett teaches Olin's Robotics I and Robotics II courses, which use LabVIEW and corresponding National Instruments hardware as an integral tool in rapid robotics development. In addition, he served as the Director of the Olin College SCOPE program and encouraged the use of National Instruments technologies for large-scale robotics projects. His passion for National Instruments products has created an undergraduate robotics community that has achieved astounding things.

Comments
LPS
NI Employee (retired)

Hello there,

Thank you so much for your project submission into the NI LabVIEW Student Design Competition. It's great to see your enthusiasm for NI LabVIEW! Make sure you share your project URL(https://decibel.ni.com/content/docs/DOC-16513) with your peers and faculty so you can collect votes for your project and win. Collecting the most "likes" gives you the opportunity to win cash prizes for your project submission. If you or your friends have any questions about how to go about "voting" for your project, tell them to read this brief document (https://decibel.ni.com/content/docs/DOC-16409).

I'm curious to know, what's your favorite part about using LabVIEW and how did you hear about the competition? Great work!!

Good Luck, Liz in Austin, TX.
