Programmed a collaborative robot arm to physically play chess against a human opponent. The system integrates computer vision for live board state detection, motion planning for safe piece manipulation, and real-time communication with a custom chess AI engine.
Describe the project context — what cobot platform did you use, what was the goal, and what made this technically interesting? Explain how the physical and software sides came together.
Replace this placeholder text with real project details when you're ready to publish.
Cobot arm playing chess — [Add caption]
Describe how the board state was detected — camera setup, what library or approach you used (OpenCV, etc.), how pieces were identified and tracked, and what made board detection challenging (lighting, piece occlusion, etc.).
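Once per-square occupancy is extracted from the camera image (e.g. via OpenCV color or edge analysis), the human's move can be inferred by diffing occupancy grids between frames. A minimal sketch of that diffing step, with hypothetical function names and a simplified 64-cell occupancy representation (castling and en passant omitted):

```python
# Sketch: infer a move by diffing per-square occupancy grids.
# The grids themselves would come from the vision pipeline; the
# representation and function names here are illustrative assumptions.

FILES = "abcdefgh"

def square_name(index):
    """Map a 0-63 index (a1=0, h8=63) to algebraic notation."""
    return FILES[index % 8] + str(index // 8 + 1)

def infer_move(before, after):
    """Return (from_sq, to_sq) given two 64-element occupancy lists.

    A simple move vacates exactly one square and fills exactly one
    other. A capture vacates one square while the destination stays
    occupied, so occupancy alone cannot locate it without tracking
    piece identity or color.
    """
    vacated = [i for i in range(64) if before[i] and not after[i]]
    filled = [i for i in range(64) if not before[i] and after[i]]
    if len(vacated) == 1 and len(filled) == 1:
        return square_name(vacated[0]), square_name(filled[0])
    if len(vacated) == 1 and not filled:
        return square_name(vacated[0]), None  # capture: destination unknown
    return None
```

The capture ambiguity is one reason real pipelines also classify piece color per square rather than relying on occupancy alone.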
Describe how move commands from the chess engine were converted into robot trajectories. How did you handle piece pickup, placement, and capture? What safety considerations were needed for operating around a human player?
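One common way to turn an engine move into a trajectory is to map each algebraic square to a Cartesian point in the robot frame, then emit an approach-descend-grip-lift waypoint sequence for the motion planner to interpolate. The sketch below assumes that pattern; all calibration constants (board origin, square pitch, heights) are hypothetical placeholders that would come from hand-eye calibration of the actual cobot:

```python
# Sketch: algebraic square -> robot-frame XY, plus a pick/place
# waypoint sequence. All constants are assumed example values.

SQUARE_MM = 40.0               # assumed square pitch, in mm
BOARD_ORIGIN = (100.0, 50.0)   # assumed XY of square a1's center, in mm
SAFE_Z, GRIP_Z = 120.0, 15.0   # assumed travel and grasp heights, in mm

def square_to_xy(square):
    """Convert e.g. 'e4' to an (x, y) center point in the robot frame."""
    file_idx = ord(square[0]) - ord("a")
    rank_idx = int(square[1]) - 1
    return (BOARD_ORIGIN[0] + file_idx * SQUARE_MM,
            BOARD_ORIGIN[1] + rank_idx * SQUARE_MM)

def pick_place_waypoints(from_sq, to_sq):
    """Approach-descend-grip-lift sequence of (x, y, z, gripper) targets."""
    fx, fy = square_to_xy(from_sq)
    tx, ty = square_to_xy(to_sq)
    return [
        (fx, fy, SAFE_Z, "open"),   # hover over source square
        (fx, fy, GRIP_Z, "open"),   # descend to piece
        (fx, fy, GRIP_Z, "close"),  # grip
        (fx, fy, SAFE_Z, "close"),  # lift clear of neighboring pieces
        (tx, ty, SAFE_Z, "close"),  # traverse at safe height
        (tx, ty, GRIP_Z, "close"),  # lower onto destination
        (tx, ty, GRIP_Z, "open"),   # release
        (tx, ty, SAFE_Z, "open"),   # retreat
    ]
```

A capture would add a preceding sequence that removes the taken piece to a dump zone; traversing only at `SAFE_Z`, plus the cobot's built-in force limits and speed caps, keeps motion safe around the seated human.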
Describe the interface between the cobot controller and the chess AI engine. How did the two systems communicate? What happened when the human made a move — how did the system detect it and trigger the AI response?
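If the engine speaks UCI (the standard text protocol used by engines such as Stockfish), the controller can drive it over stdin/stdout: send the position after each detected human move, issue `go`, and parse the `bestmove` reply. A sketch under that assumption; the subprocess wiring is illustrative and `ask_engine` presumes `proc` is a `subprocess.Popen` of the engine binary with text-mode pipes:

```python
# Sketch of a UCI round trip, assuming a UCI-speaking engine process.

def parse_bestmove(line):
    """Extract the move from a UCI 'bestmove' reply, e.g.
    'bestmove e2e4 ponder e7e5' -> 'e2e4'. Returns None otherwise."""
    parts = line.strip().split()
    if len(parts) >= 2 and parts[0] == "bestmove":
        return parts[1]
    return None

def ask_engine(proc, moves_so_far, movetime_ms=1000):
    """Send the current position and block until the engine replies."""
    proc.stdin.write(f"position startpos moves {' '.join(moves_so_far)}\n")
    proc.stdin.write(f"go movetime {movetime_ms}\n")
    proc.stdin.flush()
    for line in proc.stdout:
        move = parse_bestmove(line)
        if move:
            return move
```

In the full loop, the vision system's detected human move is appended to `moves_so_far`, the engine's reply is fed to the trajectory generator, and the cycle repeats.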
See also: Chess AI Engine project →
Full system in operation — human vs. cobot