Here are short descriptions of a few projects I have been working on recently. See my GitHub for the source code.
ROBOTIC SCORPION
In Spring 2016, I assembled a robotic scorpion from the Robotis Bioloid Premium kit and wrote code to drive its locomotion. The purpose was mainly demonstration: to persuade Professor Heni Ben Amor to let me join his Interactive Robotics Lab at ASU. A video I made showing several modes of locomotion is here.
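The basic idea behind multi-legged locomotion of this kind is to drive each leg with the same periodic signal, offset in phase so the legs sweep in sequence. A minimal sketch of that pattern in pure Python (the function name, gait parameters, and leg count are illustrative, not the actual Bioloid controller code):

```python
import math

def gait_joint_angles(t, n_legs=6, amplitude=0.4, freq=1.0):
    """Return one swing-joint angle (radians) per leg at time t.

    Every leg follows the same sine wave, offset in phase by
    2*pi*i/n_legs, so the legs sweep one after another and the body
    advances smoothly. Parameter values here are illustrative, not
    taken from the real robot.
    """
    return [
        amplitude * math.sin(2 * math.pi * freq * t + 2 * math.pi * i / n_legs)
        for i in range(n_legs)
    ]
```

Each tick of the control loop would evaluate this at the current time and send the resulting angles to the servos.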
GENERATIVE ADVERSARIAL NETWORKS
In Fall 2017, I collaborated with Dianne Hansford at ASU on an independent study project involving Generative Adversarial Networks (GANs). I adapted preexisting TensorFlow code that used a GAN to generate images of cars, trained on the Stanford Cars Dataset, and then used the generated car images to produce 3D point clouds. My final report is here.
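A GAN pits a generator against a discriminator: the discriminator learns to tell real images from generated ones, while the generator learns to fool it. A minimal sketch of the two adversarial loss terms in NumPy (the toy logit arrays stand in for real network outputs; this is not the TensorFlow car-image code itself):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gan_losses(d_logits_real, d_logits_fake):
    """Standard (non-saturating) GAN losses from discriminator logits.

    d_logits_real: discriminator outputs on real images x
    d_logits_fake: discriminator outputs on generated samples G(z)
    """
    d_real = sigmoid(d_logits_real)
    d_fake = sigmoid(d_logits_fake)
    # Discriminator: push D(x) toward 1 on real data, D(G(z)) toward 0 on fakes.
    d_loss = -np.mean(np.log(d_real)) - np.mean(np.log(1.0 - d_fake))
    # Generator (non-saturating form): push D(G(z)) toward 1.
    g_loss = -np.mean(np.log(d_fake))
    return d_loss, g_loss
```

Training alternates gradient steps on these two losses; at the start, when the discriminator outputs ~0.5 everywhere, both losses sit near log 2 per term.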
Also in Fall 2017, as part of my coursework for CSE691 Optimization, I wrote a critique of Radford et al.'s paper introducing the DCGAN architecture. I used an implementation of DCGAN to create the car images for the independent study above. My critique for CSE691 is here, and my final project presentation is here.
FACE-TRACKING TURTLEBOT
For our final project in CSE591 Perception in Robotics at ASU, my team programmed a TurtleBot to detect and track a human face, then follow it as it moved around the room. Our final report can be found here, and a video showing the TurtleBot in action is here.
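The follow behavior reduces to two proportional control terms: turn so the detected face stays horizontally centered in the camera frame, and drive forward or backward until the face's apparent size (a rough proxy for distance) matches a target. A minimal sketch in pure Python (function name, gains, and pixel values are illustrative, not our actual ROS node):

```python
def follow_command(face_cx, face_w, frame_w=640,
                   target_w=150, k_ang=0.005, k_lin=0.002):
    """Map a detected face bounding box to a (linear, angular) velocity pair.

    face_cx: horizontal center of the face box, in pixels
    face_w:  width of the face box, in pixels (proxy for distance)
    """
    # Face left of center -> positive angular velocity (turn left).
    angular = -k_ang * (face_cx - frame_w / 2)
    # Face smaller than target (farther away) -> positive linear velocity.
    linear = k_lin * (target_w - face_w)
    return linear, angular
```

In a ROS node these two values would be published on the robot's velocity topic each time the face detector fires; the proportional gains control how aggressively the robot re-centers.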