Baxter is now able to successfully pick up pieces, given their pixel coordinates
https://drive.google.com/open?id=10NgMdBq_DEMdI87KBSZC5wiTLnITuNA2
Project Team: Matthew Ferrara
Advisor: Dr. Seung-yun Kim
Using TCNJ's research robot Baxter, the project applies computer vision to recognize checkers pieces, calculates their positions in 3D space relative to Baxter, and directs Baxter to pick and place pieces. This functionality will be integrated with an open-source checkers engine, with the goal of enabling Baxter to play the game without any human intervention beyond initial startup.
The project uses image-recognition libraries to extract useful information (features) from Baxter's head and hand cameras. These features will then be transformed into 3D coordinates relative to Baxter's position. From there, inverse kinematics (IK) will be used to position Baxter's joints to pick up or place the identified piece.
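The pixel-to-3D step above can be sketched with the standard pinhole camera model. The camera intrinsics and the camera-to-base translation below are hypothetical placeholders, not Baxter's calibrated values; in practice the intrinsics come from camera calibration (e.g. the ROS camera_info topic) and the full transform from the tf tree:

```python
# Hypothetical camera intrinsics -- real values come from calibration.
FX, FY = 400.0, 400.0   # focal lengths in pixels (assumed)
CX, CY = 320.0, 200.0   # principal point in pixels (assumed)

def pixel_to_camera_frame(u, v, depth):
    """Back-project pixel (u, v) to a 3D point in the camera frame using
    the pinhole model, given a known depth (e.g. the fixed distance from
    the hand camera to the checkers board)."""
    x = (u - CX) * depth / FX
    y = (v - CY) * depth / FY
    return (x, y, depth)

def camera_to_base_frame(point, translation):
    """Shift a camera-frame point into Baxter's base frame. The rotation
    is assumed to be identity for brevity; a real setup would apply the
    full rigid transform looked up from the tf tree."""
    return tuple(p + t for p, t in zip(point, translation))

# Example: a piece detected at pixel (480, 300) with the board 0.8 m away.
cam_pt = pixel_to_camera_frame(480, 300, 0.8)
print(cam_pt)   # (0.32, 0.2, 0.8)
base_pt = camera_to_base_frame(cam_pt, (0.6, -0.1, -0.2))
print(base_pt)
```

The resulting base-frame coordinates are what would be handed to the IK solver to compute joint angles for the pick/place motion.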