10 Jul 2025
Instructions for each group are marked in red below. For Future Innovators, there are multiple tracks that you should work on concurrently; you won’t make it if you’re only doing one thing at a time.
Google Meet link: I’ll be in here from 3pm to 5.30pm. Hop in if you need to ask me anything.
WRO 2025
- Future Engineer
- There are 2 challenges (you’ll need to do both); I suggest you start with the open challenge (ie. no obstacles).
- Don’t need camera for open challenge.
- Can perform simple wall following using the ultrasonic sensor you’ve installed.
- Proportional control just like line / gyro following.
- Calculate error (difference between actual and desired distance) and correction.
- Sensor must be forward of the center of rotation to avoid positive feedback.
- Optionally improve stability by combining gyro and wall following.
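- As a starting point, here is a minimal proportional wall-following sketch, assuming a Pybricks setup; the ports, gain, and target distance are assumptions you’ll need to tune on your own robot.
    from pybricks.pupdevices import Motor, UltrasonicSensor
    from pybricks.parameters import Port

    steer = Motor(Port.A)              # steering motor (Ackermann front wheels)
    drive = Motor(Port.B)              # rear drive motor
    wall = UltrasonicSensor(Port.C)    # ultrasonic pointing sideways at the wall

    KP = 0.5          # proportional gain; tune on your robot
    TARGET_MM = 100   # desired distance from the wall

    drive.run(400)
    while True:
        error = wall.distance() - TARGET_MM   # positive when too far from the wall
        # Flip the sign of the correction if your robot steers away from the wall.
        steer.track_target(KP * error)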
- Read rules
- Build your robot (MUST be a 4-wheel Ackermann steering design. No other drive mechanisms are allowed.)
- Ultrasonic and Camera
- Camera
- OpenMV. Start by looking at color blob detection. Send results to the main controller using PUPRemote.
- Ultrasonic
- Detecting walls. Know when to turn and which way to turn.
- Alternative is a LiDAR (provides more data points). Higher precision. Can differentiate between small objects (like obstacles) and large objects (like walls).
- Parts list. Send to your teacher. I’ll try to get you what I can.
- Steering mechanism is easiest to build using Lego. Same for the differential (optional; you can run without one, but the wheels will skid).
- Alternative is to use a pre-built RC car as a base.
- Robomission
- Missions
- Target to complete your first mission today (ie. pick up and deposit both balls).
- Be extremely consistent with your starting position. Especially angle, which should be near perfect (<1 degree error).
- Align your robot close to your target in both the X and Y directions before interacting with it (eg. before opening the box, align to both a horizontal and a vertical black line).
- List of functions to prepare
- Turn Left / Right: These should turn your robot quickly and accurately to the target direction
- Fwd cm: Move your robot forward for the given distance. It should use the gyro to keep your robot moving straight.
- Rev cm: Same as above, but for reverse.
- Fwd until black / white: Move forward while using the gyro to keep your robot straight. Stops when the color sensor sees black / white.
- Rev until black / white: Same as above, but for reverse.
- If you have access to the Spike Prime robot, implement these functions using the robot. Else, implement them in GearsBot.
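- A minimal sketch of “Fwd cm” with gyro correction, assuming Pybricks on a Spike Prime; the ports, wheel circumference, and gain are assumptions. Rev cm and the until-black / until-white variants follow the same pattern, with only the loop condition changed.
    from pybricks.hubs import PrimeHub
    from pybricks.pupdevices import Motor
    from pybricks.parameters import Port, Direction

    hub = PrimeHub()
    left = Motor(Port.A, Direction.COUNTERCLOCKWISE)
    right = Motor(Port.B)

    WHEEL_CIRC_CM = 17.6   # an assumption; measure your own wheel

    def fwd_cm(cm, speed=300, kp=2):
        target = hub.imu.heading()      # hold the heading we started with
        left.reset_angle(0)
        end_deg = cm * 360 / WHEEL_CIRC_CM
        while left.angle() < end_deg:
            correction = kp * (hub.imu.heading() - target)
            # Flip the correction sign if your robot curves instead of straightening.
            left.run(speed + correction)
            right.run(speed - correction)
        left.stop()
        right.stop()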
- Future Innovator
- Drone guided garbage collector
- Two tracks to work on concurrently…
- Track 1: Mount your OpenMV cam, ESP32, and power bank on the PVC pipe. Make sure the OpenMV cam is looking straight down (ie. not tilted). Check that the OpenMV cam can recognise an AprilTag on the floor; so far, we’ve only tested the AprilTag at a short distance, so we need to make sure it still works at the longer distance when mounted on the pole. If it can’t, you can either increase the camera resolution to VGA (currently QVGA) at the cost of a much lower frame rate (~1 fps), or reduce the pole height at the cost of seeing a smaller area.
- Track 2: Send the command via ESPNow to the robot. Code the robot to receive the commands and act accordingly.
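- A minimal ESP-NOW sketch in MicroPython; the peer MAC address and the command string are placeholders.
    import network, espnow

    sta = network.WLAN(network.STA_IF)
    sta.active(True)

    e = espnow.ESPNow()
    e.active(True)

    # On the sender (camera side):
    peer = b'\xbb\xbb\xbb\xbb\xbb\xbb'   # replace with the robot ESP32's MAC address
    e.add_peer(peer)
    e.send(peer, 'fwd')

    # On the receiver (robot side), the same setup code applies, then:
    host, msg = e.recv()                 # blocks until a message arrives
    if msg == b'fwd':
        pass                             # drive forward here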
- Documentation for AprilTag in OpenMV: The OpenMV IDE contains example code that you can use for an easy start.
- Use this to generate AprilTags for printing
- AprilTag official website
- Nice summary of how AprilTags work
- TODO:
- Build support for camera.
- Robot; need to add garbage storage area, AprilTag (…may need a motorized cover)
- Collect images of garbage
- Tissue
- Cans. Can be difficult, as they look very different depending on the labels and orientation.
- Cups. Easier, as these cups are often clear or white (eg. Starbucks, bubble tea, McDonald’s).
- Collect lots of images (>100), in different orientations and backgrounds, and of different types (eg. bubble tea cups from different stores).
- Alternatively, train with just one specific type of can/cup, and make sure to use that exact can/cup during demo.
- Training
- Parts delivery on 17 Jun
- OpenMV H7 Plus: This one can do AprilTag detection at QVGA resolution. You’ll need to merge this with the garbage detection. Start with blob detection for garbage (…easier than object detection, and there is example code in the OpenMV IDE), then combine blob detection and AprilTag detection in one program (see the sketch after this list).
- PVC Pipe 90 degrees connector: Use this with the PVC pipes to build a mount for the camera.
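- A minimal sketch combining blob detection and AprilTag detection in one loop on the H7 Plus; the LAB threshold for garbage is a placeholder to tune in the IDE.
    import sensor

    sensor.reset()
    sensor.set_pixformat(sensor.RGB565)
    sensor.set_framesize(sensor.QVGA)   # VGA sees tags from further away, but at ~1 fps
    sensor.skip_frames(time=2000)

    GARBAGE = (30, 100, 15, 127, 15, 127)   # placeholder LAB threshold

    while True:
        img = sensor.snapshot()
        for tag in img.find_apriltags():    # the robot's AprilTag
            print('tag', tag.id(), tag.cx(), tag.cy())
        for blob in img.find_blobs([GARBAGE], pixels_threshold=100, merge=True):
            print('blob', blob.cx(), blob.cy())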
- Emotional Support Robot
- Three tracks to work on concurrently…
- Track 1: Complete I-Spy game. Code the interaction between the CYD and ESP32Cam (ie. send command from CYD to ESP32Cam, and send result (list of 4 words) from ESP32Cam to CYD). Code the interaction between the CYD and normal ESP32 (detect touch).
- Track 2: Sew touch sensing wires. Your wires should ideally be bare strands on one end, and insulated on the other end (…so that we can crimp on a connector). When sewing, make sure that there are no uninsulated wires dangling inside the robot.
- Track 3: Install arm. I have the new parts laser cut, but can’t send them to you today. If you’re able to, you should start exploring ways to mount the arm. We want to complete mounting the arms by the next session.
- Object recognition using Google’s Vision API
- Sample code
- When sending the request using curl, use this
curl -X POST \
  -H "X-goog-api-key: YOUR_API_KEY" \
  -H "Content-Type: application/json; charset=utf-8" \
  -d @request.json \
  "https://vision.googleapis.com/v1/images:annotate"
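- A minimal request.json for label detection looks like the following; the image content must be base64-encoded, and maxResults is up to you.
    {
      "requests": [
        {
          "image": { "content": "BASE64_ENCODED_IMAGE" },
          "features": [ { "type": "LABEL_DETECTION", "maxResults": 4 } ]
        }
      ]
    }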
- Text-to-Speech: If using a microcontroller, you’ll need hardware for this, either https://vi.aliexpress.com/item/1005008596064970.html or https://vi.aliexpress.com/item/1005008572550921.html
- Text-to-Speech alternative: If using a computer, you can do text-to-speech on the computer itself with no added hardware.
- Convert images to suitable format for CYD
- Reading materials for soft robot
- Soft arm
- Review this design and improve on it.
- How to mount to the “spine” of the robot
- I-Spy
- Done in text
- Integrate with CYD. ESP32Cam will transmit the four options to the CYD via wireless (ESPNow). The CYD displays the options and detects presses.
- Integrate with MP3 module VS1003.
- Emotion detection
- Trigger detection with pet on head
- Actions: Sad (display/say something (eg. “come here for a hug”), detect when the child hugs, then hug the child).
- Prepare stock photos
- Touch detection
- Copper threads.
- Parts delivery on 17 Jun
- MP3 Player module (VS1003): There’s an extension for this in IoTy. Use blocks to generate sample Python code that you can incorporate into your main code. The blocks will also indicate the wiring connections. Record short MP3 files (…low bitrate such as 32kbps is recommended) and transfer to the ESP32 using the “Files on device…” option under the kebab menu button.
- Motors with mounting brackets x2: Together with the motor you already have, you should now be able to build 3 arms. The third arm is a spare / demo set (judges can’t see the arm in the soft toy).
- Motor flange x2: For attaching string to motor.
- Laser cut wood (3 sets): For building the arm.
- String: For building the arm.
- Speaker: For MP3 player.
- Snail Eliminator
- Three tracks to work on concurrently…
- Track 1: Complete spraying mechanism. Make sure servo can trigger the spray. Mount it on the robot, pointing sideways and targeting a point around a meter away (…your robot won’t get very close to the snail).
- Track 2: Get your robot to move. Finish the connection between the ESP32 and the H-bridge motor driver, and test moving and turning (see the sketch after this list). Be very careful not to connect the batteries in reverse; it will destroy the motor driver instantly.
- Track 3: Perform your snail detection training (…see below for instructions).
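- A minimal motor-driver sketch in MicroPython, assuming a common H-bridge with two input pins per motor; all pin numbers are assumptions to match your wiring.
    from machine import Pin, PWM

    # Two PWM pins per motor: drive one, keep the other at 0.
    left_a = PWM(Pin(25), freq=1000, duty=0)
    left_b = PWM(Pin(26), freq=1000, duty=0)
    right_a = PWM(Pin(32), freq=1000, duty=0)
    right_b = PWM(Pin(33), freq=1000, duty=0)

    def set_motor(a, b, speed):
        # speed from -1023 (full reverse) to 1023 (full forward)
        if speed >= 0:
            b.duty(0)
            a.duty(speed)
        else:
            a.duty(0)
            b.duty(-speed)

    set_motor(left_a, left_b, 600)     # both forward
    set_motor(right_a, right_b, 600)
    set_motor(right_a, right_b, -600)  # right reversed -> turn on the spot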
- Detecting a snail
- Robot
- Assemble.
- Wire up your motor driver. Control motor driver.
- Mountings for camera
- Mountings for sprayer
- Sprayer
- Build trigger mechanism
- Image training
- Collected 100 images
- Labelled images
- Do training
- Test (…needs the OpenMV Cam)
- Possible to cheat a bit… train on only one specific snail image (…against multiple backgrounds), and use that exact snail image for the demo.
- Parts delivery on 17 Jun
- Long hex adapters for connecting motors to wheels
- Birds Protector
- Three tracks to work on concurrently…
- Track 1: Research. Get a better understanding of what the data from the sensors mean. At what point will you sound the alarm?
- Track 2: Alarm. Connect the siren, lights, and servo to the ESP32. The siren and light should be connected via the MOSFET board. The servo can be connected directly. Test and make sure you know how to use them.
- Track 3: Packaging. Your parts can’t be just dangling around. Find a way to package them neatly. Final product should be a single box with all parts and batteries inside.
- PMS7003. Air particle concentration sensor. Need to figure out how to interpret the data from the sensor, and decide at what level you need to sound a warning.
- AGS10. Total Volatile Organic Compound sensor. Optional. You can use it if you want.
- Figure out a way to test the sensor. Somehow make a sample of air with poor air quality.
- Alarm
- I can provide a siren and servo motors.
- Think through how you can scare off birds
- Parts delivery on 17 Jun
- Particle Concentration Sensor (PMS7003): There’s an extension for this in IoTy. Use blocks to generate sample Python code that you can incorporate into your main code. See here for pinout https://files.aposteriori.com.sg/get/sK3uM5AZF8.webp. Be very careful not to make an incorrect connection, as it may damage the sensor!
- Volatile Organic Compound sensor (AGS10): Another sensor for air quality. You don’t need to use this, but it’s available to you if your research indicates that VOC is relevant to your project. Like the PMS7003, there’s an extension for this in IoTy and you can use blocks to generate sample Python code. Note that it takes a few minutes for the reading to stabilize.
- Servo driver (PCA9685): Use this to control the servo. You can find documentation here. Note that the servo supply should not exceed 2×18650 batteries (nominally 7.4V).
- MG996 Servo x2: You can use this to make a waving arm for scaring off birds.
- ESP32 with adapter board: Program this to read the sensors and control the parts.
- Flashing light and siren: These are designed for 12V power, but may work on lower voltages (ie. the 7.4V from 2×18650 batteries).
- MOSFET board: The ESP32 can’t directly turn on the lights/siren. This acts as an electronic switch to turn the lights/siren on or off. VIN connects to batteries. OUT connects to lights/siren. GND to GND of ESP32. TRIG to any of the output capable GPIO pins on the ESP32. Note that GND of the ESP32 and the GND of the battery must be common. Be very careful not to connect VIN in reverse; this will likely destroy the board instantly.
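- A minimal switching sketch in MicroPython; GPIO 13 for TRIG is an assumption.
    from machine import Pin
    import time

    siren = Pin(13, Pin.OUT)   # wired to TRIG on the MOSFET board
    siren.value(1)             # siren on
    time.sleep(2)
    siren.value(0)             # siren off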
Sec 1
- Please try this tutorial
- Pybricks Documentation
CoSpace
- For beginners
- Basic tutorial on C coding http://learn-c.org (Need only do “Learn the Basics”)
- Design Patterns (ODP / PDF)
- Handling Durations (ODP / PDF)
- Setup compilation in vscode
- Open up your cospace C file in vscode.
- Press “Ctrl+Shift+B”. You should see a message “No build task to run found. Configure build task”. Click on that message.
- Click on “Create tasks.json file from template”
- Click on “Others”. This should open a new “tasks.json” file.
- Delete everything in your “tasks.json” and replace it with the content of this link. Save your “tasks.json” (Ctrl+S).
- Go back to your C file, and press “Ctrl+Shift+B”. Your file should now be compiled into a DLL.
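- For reference, a typical tasks.json for compiling a C file into a DLL looks roughly like this; the compiler and flags below are illustrative, not the exact content of the link.
    {
      "version": "2.0.0",
      "tasks": [
        {
          "label": "Build CoSpace DLL",
          "type": "shell",
          "command": "gcc",
          "args": ["-shared", "-o", "${fileBasenameNoExtension}.dll", "${file}"],
          "group": { "kind": "build", "isDefault": true }
        }
      ]
    }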
Robocup OnStage
- Detecting the balls
- OpenCV (Computer vision library). You’ll need to install this on a laptop. Python version is recommended.
- Complete color blob detection tutorial.
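- A minimal color blob detection sketch in Python OpenCV; the HSV range is a placeholder to tune for your ball and lighting.
    import cv2

    cap = cv2.VideoCapture(0)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, (5, 120, 120), (20, 255, 255))   # placeholder orange range
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        if contours:
            c = max(contours, key=cv2.contourArea)               # largest blob = the ball
            (x, y), r = cv2.minEnclosingCircle(c)
            cv2.circle(frame, (int(x), int(y)), int(r), (0, 255, 0), 2)
        cv2.imshow('frame', frame)
        if cv2.waitKey(1) == 27:   # Esc to quit
            break
    cap.release()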
- Sending commands
- When sending data from an OpenMV Cam to an ESP32, the easiest way would be to use a UART connection.
- To wire the UART connection, connect the TX pin of one device to the RX pin of the other device (…and vice versa). The GND pin of one device should also be connected to the GND pin of the other device.
- OpenMV Cam. Read the docs to learn how to send data. You can find the TX and RX pin positions here (…use UART3. Avoid UART1). When sending data, it’s often best to use string format. Be sure to add a '\n' at the end of the line.
- ESP32. Read this page on initializing the UART and the TX RX pin numbers. And this page on how to readline. Do not use UART0 (…UART2 is fine, and you can use the default pins for that).
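- A minimal sketch of both sides; the baud rate and ESP32 pins are the usual defaults, so adjust them to your wiring.
    # OpenMV Cam (sender), UART3:
    from pyb import UART
    uart = UART(3, 115200)
    uart.write("x=%d,y=%d\n" % (10, 20))   # string format, newline-terminated

    # ESP32 (receiver), MicroPython, UART2 on its default pins:
    from machine import UART
    uart = UART(2, baudrate=115200, tx=17, rx=16)
    line = uart.readline()                 # returns None if no full line has arrived
    if line:
        print(line.decode().strip())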
Robocup Rescue Line (OpenMV Cam)
- If you are using the OpenMV Cam, you’ll need to…
- Detect the black line and green box. Install the OpenMV IDE. Open the IDE and load the multi-color blob tracking example. Try it out, and tune it to detect the black line.
- When tuning, set the preview to LAB mode, select a region that you’re interested in (eg. black line), and look at the histogram to determine a suitable threshold. Note that in LAB color space, the L represents brightness, and should be low for black.
- By default, the example uses the entire frame for detection. That’s probably too large. Use the crop function to reduce the detection area.
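- A minimal sketch with a black LAB threshold and a reduced detection region; both values are assumptions to tune.
    import sensor

    sensor.reset()
    sensor.set_pixformat(sensor.RGB565)
    sensor.set_framesize(sensor.QVGA)
    sensor.skip_frames(time=2000)

    BLACK = (0, 30, -20, 20, -20, 20)   # low L for black; tune with the IDE histogram

    while True:
        img = sensor.snapshot()
        # roi = (x, y, w, h): only search the bottom strip of the QVGA frame
        for b in img.find_blobs([BLACK], roi=(0, 180, 320, 60), pixels_threshold=200, merge=True):
            print(b.cx(), b.cy())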
- If you’re using the Spike Prime, you should use PUPRemote to communicate between the OpenMV Cam and the Spike Prime. Download the Python files from this GitHub page. pupremote.py needs to be uploaded to your OpenMV Cam, while pupremote_hub.py needs to be uploaded to your Spike Prime (via the Pybricks software).
- Other examples.
- OpenMV IDE contains an example of a line following code (Image Processing -> Black grayscale line following).
- Antons Mindstorm provides another example of line following code here.
- Neither is recommended, because… 1) you should write your own code, and 2) both are inadequate for Robocup; you’ll need to modify them, so it’s better to start with simple code that you can understand well.
- But feel free to read through and understand how these examples work.
Robocup Rescue Line (EV3)
- Tutorial on different methods of reading from an OpenMV Cam or ESP32 without blocking
- Use this when you have an EV3 or PC reading data from the serial port.
- If the reader is micropython device (eg. ESP32), UART read is non-blocking by default.
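- A minimal non-blocking read loop on a MicroPython ESP32; UART2 on the default pins is an assumption.
    from machine import UART

    uart = UART(2, baudrate=115200)

    while True:
        line = uart.readline()   # returns None immediately when nothing is waiting
        if line:
            print('got', line)
        # ...the rest of your loop keeps running without waiting on the serial port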
- Docs for line sensor
Robocup Rescue Line (Generic stuff)
These slides are old, so the sample code is based on the old EV3 software. You won’t be able to use it directly, but the concepts and approach remain the same.
- Briefing Slides (ODP / PDF)
- Single Line Follower (ODP / PDF)
- Double Sensors Line Following (ODP / PDF)
- Ending the Loop (ODP / PDF)
- Turn on Green (ODP / PDF)
- Obstacles Avoidance (ODP / PDF)
- Handling Inclines (ODP / PDF)
- Tiles package for printing on A3 paper
- Video of a common and successful design
- Video of an unconventional design
- Difficult Lines Videos
IoTy
IoTy is a platform for programming the ESP32 using blocks or Python. This is useful for OnStage, Robocup Rescue Line (…if you’re building non-Lego robots), and for general electronics projects (eg. for WRO open category).
- Link to IoTy
- Intro to ESP32 and IoTy
- Connecting
- Working with Breadboards
- Analog Output
- Digital Input
- Ultrasonic Sensor
- Neopixel
- HSV
- IoT with IoTy