
We won first place!
We are so proud to have been awarded 1st place for Outstanding Registered Students' Organization Exhibit!
Scroll down to see our computer-vision-powered robot in action...

Latest news!
We will be placed at the Starbucks area of Sidney Lu MEB for Engineering Open House!
Our team is ecstatic to be placed at such a high-profile location. Hope to see you guys there!!
About the project
This project is funded by the Society for Engineering Mechanics (SEM), an RSO for which I am a project lead.
R2D2 V2.0 aims to accomplish more than its predecessor, a stationary robot that shakes hands on voice command:
R2V2 moves around with varying speed and direction depending on the user's hand gestures, letting the user (middle and high school students visiting Engineering Open House) seemingly control the robot with The Force.
Archived weekly blog
Week 1

Like any other friendly neighborhood robot, R2 needs to do 2 things:
Think: recognize the distance between the user’s fingers,
Then Act: adjust his speed according to said distance.
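One way to picture the Act step is a simple mapping from the measured finger distance to a speed command. This is just an illustrative sketch: the dead zone and linear scaling are my assumptions, not the project's actual tuning.

```python
def distance_to_speed(pinch_distance, max_speed=1.0, dead_zone=0.05):
    """Map a normalized thumb-index distance (0..1) to a robot speed.

    Below the dead zone the robot stays still; above it, speed scales
    linearly up to max_speed. All values here are illustrative.
    """
    if pinch_distance < dead_zone:
        return 0.0
    scale = (pinch_distance - dead_zone) / (1.0 - dead_zone)
    return min(max_speed, max_speed * scale)
```

Fingers together means stop; fingers fully apart means full speed.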
Let’s tackle the thinking part first: How does he know what the user is doing with their hands?
OpenCV
I used OpenCV and MediaPipe to write a Python program (in PyCharm) so that the laptop webcam can identify whether a hand is in view
Nuances
Of course, a few little nuances were added so things move smoothly, like a feature that calculates how much of the frame the hand takes up; that way, if the hand is too far from or too close to the camera, the computer won't register it.
Thumb and Index
Then I calculated the distance between the tip of the thumb and the tip of the index finger (MediaPipe was very helpful in figuring out where the two fingertips are, and you can use either your left or right hand!)
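The frame-coverage check and the fingertip distance can both be written as small helpers over MediaPipe's hand landmarks (indices 4 and 8 are MediaPipe's standard thumb-tip and index-tip landmarks; the range thresholds here are illustrative guesses, not the project's real numbers). A sketch:

```python
import math

# MediaPipe hand-landmark indices (MediaPipe's standard numbering).
THUMB_TIP = 4
INDEX_TIP = 8

def pinch_distance(landmarks):
    """Euclidean distance between thumb tip and index tip.

    `landmarks` is a list of 21 (x, y) pairs in normalized image
    coordinates, e.g. [(lm.x, lm.y) for lm in hand.landmark] taken
    from a MediaPipe Hands.process() result.
    """
    (x1, y1), (x2, y2) = landmarks[THUMB_TIP], landmarks[INDEX_TIP]
    return math.hypot(x2 - x1, y2 - y1)

def hand_frame_fraction(landmarks):
    """Fraction of the frame covered by the hand's bounding box."""
    xs = [x for x, _ in landmarks]
    ys = [y for _, y in landmarks]
    return (max(xs) - min(xs)) * (max(ys) - min(ys))

def hand_in_range(landmarks, lo=0.02, hi=0.5):
    """Reject hands that are too far away (tiny) or too close (huge).

    The lo/hi thresholds are hypothetical, not the project's values.
    """
    return lo <= hand_frame_fraction(landmarks) <= hi
```

Keeping these as pure functions over (x, y) pairs makes them easy to test without a webcam.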
Tested
I tested the effectiveness of the program by using it to control the volume of my laptop, then ran straight to the MEL to show my clubmates what I'd done
Week 2 + 3

Now how do I control an actual moving system with what I’ve got?
ME 461
Luckily I was taking a class, ME 461, where we have a segway robot controlled by a TI board. I also happen to have the best TA in the world, who took the time to show me how to connect a Raspberry Pi to the segbot.
TCP
I used a TCP program to send the data from my laptop to the robot wirelessly, and modified my code so that the segbot takes the value and sets it as its speed
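A minimal sketch of that wireless link, assuming a small TCP server on the robot side and a client on the laptop, sending one speed value per line; the host, port, and text protocol here are my assumptions, not the team's actual code:

```python
import socket
import threading

HOST, PORT = "127.0.0.1", 50007  # on the real robot, this would be the Pi's address

def run_receiver(ready, speeds):
    """Robot side: accept one connection and read newline-delimited floats."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind((HOST, PORT))
        srv.listen(1)
        ready.set()                         # signal that the server is listening
        conn, _ = srv.accept()
        with conn, conn.makefile("r") as f:
            for line in f:
                speeds.append(float(line))  # here the segbot would set its speed

def send_speeds(values):
    """Laptop side: connect and stream speed values, one per line."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
        cli.connect((HOST, PORT))
        for v in values:
            cli.sendall(f"{v}\n".encode())

# Loopback demo: run both ends on one machine.
ready = threading.Event()
received = []
t = threading.Thread(target=run_receiver, args=(ready, received))
t.start()
ready.wait()
send_speeds([0.0, 0.5, 1.0])
t.join()
```

Running both ends on localhost like this is a handy way to test the link before the Pi is even on the network.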
It worked!
Week 4

This brings us to the near present.
We need to recreate the same process for the R2 system that we have.
Research
My team of three got down to researching appropriate motors and hardware
Rasp Pi
We set up our own Raspberry Pi and are writing the necessary programs
Purchase
We put in a purchase form this week for all the parts we need
Motor testing
We are working on a test platform to stress-test the motors
Join our team
Here is a link to our Discord channel. Feel free to join my project or those of our other SEM project leads!