A Raspbuggy simulation layer

Description

Raspbuggy (http://cmcrobotics.github.io/raspbuggy) is an educational platform used to introduce children to programming and automation.

It takes the form of a Wi-Fi-enabled RC car that ships with its own Google Blockly-based programming environment.

The idea is to write an abstraction layer that renders Raspbuggy behaviour in a 2D (canvas) or 3D view, so that users can simulate the results of their programming logic before running it on the physical car.

Raspbuggy is based on the Raspberry Pi platform and can integrate a wide range of hardware sensors, cameras and software packages.

Currently, two hardware abstraction layers are implemented for Raspbuggy: Pimoroni Explorer and LEGO Mindstorms NXT. However, there is no way for Raspbuggy users to first simulate their code before running it.

Objective: write an abstraction layer that renders Raspbuggy behaviour in a 2D (canvas) or 3D-based view (sketched after the list below) to simulate:

  • Motor movement control (forward / backward / right and left motors / angle-based movement)
  • Obstacle detection
  • Reflectivity reading (light / dark)
  • Pen support (lower / raise the pen): attaching a pen to the Raspbuggy allows it to act like a real-life Logo environment: https://en.wikipedia.org/wiki/Logo_(programming_language)
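
As a rough illustration, the simulation layer could expose the same commands that the existing hardware abstraction layers provide, but render their effects onto a canvas instead of driving motors. The sketch below is a minimal, hypothetical starting point; none of the names (SimulatedBuggy, forward, turn, lowerPen) come from the existing Raspbuggy codebase:

    // Minimal sketch of a simulated buggy on a 2D canvas. All names are
    // hypothetical and do not refer to the existing Raspbuggy code.
    class SimulatedBuggy {
      constructor(canvas) {
        this.ctx = canvas.getContext("2d");
        this.x = canvas.width / 2;   // position in pixels
        this.y = canvas.height / 2;
        this.heading = 0;            // radians, 0 = pointing right
        this.penDown = false;
      }

      // Motor movement: positive distance drives forward, negative backward.
      forward(distance) {
        const nx = this.x + distance * Math.cos(this.heading);
        const ny = this.y + distance * Math.sin(this.heading);
        if (this.penDown) {          // leave a Logo-style pen trail
          this.ctx.beginPath();
          this.ctx.moveTo(this.x, this.y);
          this.ctx.lineTo(nx, ny);
          this.ctx.stroke();
        }
        this.x = nx;
        this.y = ny;
      }

      // Angle-based movement: rotate in place, as if the left and right
      // motors were driven in opposite directions.
      turn(angleDegrees) {
        this.heading += angleDegrees * Math.PI / 180;
      }

      lowerPen() { this.penDown = true; }
      raisePen() { this.penDown = false; }
    }

Code generated from the Blockly blocks could then run unchanged against either this simulated backend or a real hardware abstraction layer.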
Goals of the project
  • Implement a Canvas / WebGL abstraction layer that can render Raspbuggy block logic.
  • Provide support for in-browser rendering of:
    • Motor movement control (forward / backward / right and left motors / angle-based movement)
    • Obstacle detection (ultrasonic sensor)
    • Reflectivity reading (light / dark), with a sketch of both sensor simulations after this list
    • Pen support (lower / raise)
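
To make the sensor goals concrete, here is one possible way to simulate them on a canvas: reflectivity by sampling the brightness of the floor pixel under the buggy, and obstacle detection by stepping a ray forward until it hits a wall. The function names, the separate floor canvas and the rectangle-based obstacle list are all assumptions made for this sketch:

    // Reflectivity: sample the floor pixel directly beneath the buggy.
    function readReflectivity(floorCtx, x, y) {
      const p = floorCtx.getImageData(Math.round(x), Math.round(y), 1, 1).data;
      // Perceived luminance normalised to [0, 1]: ~1 over light, ~0 over dark.
      return (0.299 * p[0] + 0.587 * p[1] + 0.114 * p[2]) / 255;
    }

    // Obstacle detection: march a ray out from the buggy, like an ultrasonic
    // ping, and report the distance to the first axis-aligned obstacle hit.
    function readDistance(obstacles, x, y, heading, maxRange = 200) {
      for (let d = 0; d < maxRange; d++) {
        const px = x + d * Math.cos(heading);
        const py = y + d * Math.sin(heading);
        if (obstacles.some(o => px >= o.x && px <= o.x + o.w &&
                                py >= o.y && py <= o.y + o.h)) {
          return d;
        }
      }
      return maxRange;  // nothing within sensor range
    }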
Skills being sought
  • Essential: fluency in JavaScript / Canvas / WebGL
  • Essential: Python basics
  • A plus: familiarity with Google Blockly custom block creation and code generation concepts (see the sketch after this list)
  • A plus: awareness of Brython (i.e. using Python as a replacement for JavaScript in the browser)
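
To give a flavour of the Blockly side, a custom block and its code generator could look roughly like the sketch below, written against the classic Blockly JavaScript API; the block name buggy_forward and the generated buggy.forward() call are invented for this example:

    // Define a custom "move forward" block with a numeric distance input.
    Blockly.Blocks['buggy_forward'] = {
      init: function() {
        this.appendValueInput('DISTANCE')
            .setCheck('Number')
            .appendField('move forward');
        this.setPreviousStatement(true);
        this.setNextStatement(true);
        this.setColour(160);
      }
    };

    // Generate JavaScript that targets the simulated buggy instead of
    // the physical hardware.
    Blockly.JavaScript['buggy_forward'] = function(block) {
      const distance = Blockly.JavaScript.valueToCode(
          block, 'DISTANCE', Blockly.JavaScript.ORDER_ATOMIC) || '0';
      return 'buggy.forward(' + distance + ');\n';
    };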
Contacts
Brice Copy
James Devine