(Upcoming) Letting AI Move: Robotics Demos Powered by Python (PyConDE Darmstadt 2026)

  • 26 Mar, 2026

Artificial intelligence can be difficult to explain, especially to people outside of tech. We often rely on slides, diagrams, or on-screen demos, but the ideas do not always stick. For people encountering AI for the first time, such as students or non-technical audiences, the terminology and concepts can remain abstract and disconnected from real-world experience.

Robots can help change that. When AI controls a physical system, its behavior becomes visible, tangible, and easier to reason about. In this talk, we explore how Python and playful robotics experiments can be used to make AI more concrete, interactive, and engaging. Using the Hugging Face Reachy Mini robot as a case study, we show how physical interaction can turn abstract AI concepts into intuitive, memorable experiences.

The perspective of this talk is intentionally non-traditional: we started with no prior knowledge of robotics or mechanics and approached the problem purely from a Python developer’s point of view. This journey strongly shapes the talk. Rather than focusing on advanced robotics engineering, the emphasis is on accessibility, experimentation, and learning by doing. The goal is to show that robotics can be an approachable medium for explaining AI, even for people without a hardware or engineering background.

During the talk, we walk through basic building blocks such as movement, gestures, and simple interaction patterns, and show how AI-driven behavior can be layered on top of them using familiar Python tools. We share examples from real experiments and demos, including what worked well, what failed, and what we learned from unexpected behavior in live settings.
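The layering described above can be sketched in plain Python. The `ReachyLike` class and `classify_sentiment` function below are hypothetical stand-ins, not the actual Reachy Mini SDK or a real model; the point is the structure: simple movement primitives at the bottom, and an AI-driven layer on top that decides which of them to trigger.

```python
class ReachyLike:
    """Hypothetical stand-in for a robot SDK; logs actions instead of moving hardware."""

    def __init__(self):
        self.log = []

    # --- basic building blocks: movement and gestures ---
    def turn_head(self, direction):
        self.log.append(f"turn_head:{direction}")

    def wave(self):
        self.log.append("wave")

    def nod(self):
        self.log.append("nod")


# --- AI layer: a placeholder for a real model call (e.g. a text classifier) ---
def classify_sentiment(text):
    """Toy heuristic standing in for an actual model's prediction."""
    return "positive" if "great" in text.lower() else "neutral"


def react(robot, text):
    """Compose AI-driven behavior out of the movement primitives."""
    sentiment = classify_sentiment(text)
    if sentiment == "positive":
        robot.wave()
        robot.nod()
    else:
        robot.turn_head("center")
    return sentiment


robot = ReachyLike()
react(robot, "This demo is great!")
print(robot.log)
```

Keeping the primitives and the decision layer separate like this is also what makes live demos debuggable: when something behaves unexpectedly, the action log shows whether the model's decision or the movement code was at fault.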