I integrated the Intel RealSense SDK and found ways to weave its features into the user experience. For example, on the initial attract screen, when a person walked into the camera's view, the robot guide would immediately react and start flying circles around them to grab their attention.
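For the curious, below is a minimal sketch of how that kind of presence trigger can be wired up. It is an illustration only, not the demo's original code: the project was built on the Intel RealSense SDK's own tracking modules, while this sketch uses the modern librealsense2 Python bindings (pyrealsense2), and the distance threshold and callback are hypothetical.

```python
import time
import pyrealsense2 as rs

# Illustrative only: the original demo used the Intel RealSense SDK's own
# tracking modules; this sketch polls a raw depth stream instead.
PRESENCE_DISTANCE_M = 2.0  # hypothetical "someone is close enough" threshold


def on_person_detected():
    # Hypothetical hook: in the demo, this is roughly where the robot guide
    # would kick off its attract animation (flying circles around the user).
    print("Person detected: start attract animation")


def watch_for_presence():
    pipeline = rs.pipeline()
    config = rs.config()
    config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
    pipeline.start(config)
    try:
        while True:
            frames = pipeline.wait_for_frames()
            depth = frames.get_depth_frame()
            if not depth:
                continue
            # Sample the distance at the center of the frame; a production
            # attract loop would rely on the SDK's person/face tracking.
            dist = depth.get_distance(depth.get_width() // 2,
                                      depth.get_height() // 2)
            if 0.0 < dist < PRESENCE_DISTANCE_M:
                on_person_detected()
                break
            time.sleep(0.03)
    finally:
        pipeline.stop()


if __name__ == "__main__":
    watch_for_presence()
```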
I oversaw the project from kickoff to completion, managing the overall demo experience and the user interface. The demo guided the user through three modules: Hand Tracking, Face Tracking, and Demos.
In addition to my programming and design tasks, I managed the programming and art teams. All of our tasks were tracked in Asana, and we held weekly scrum meetings to review new tasks and break through any roadblocks on older ones. I also served as the liaison between Intel and Human Engine LLC, ensuring that any changes Intel requested were properly managed and specced out, while keeping them in the loop when major milestones were hit or any deviations from the timeline were needed.