I just finished a semester at OSU, and with that I will be updating my website with some of the projects and experiences from that time.
First of all, I recently updated the website to be more SEO-friendly (and readable in terminal browsers!) through the use of grunt-static-handlebars. With this, I no longer need Angular to process my data, as it is compiled into a static HTML file from a Handlebars template. I will also be reworking the layout to be more mobile-friendly, among other things.
This semester has been filled with incredibly valuable learning experiences and opportunities. Most importantly:
I also have some exciting future plans coming up. Next semester, I start work as a research assistant at the OSU ElectroScience Laboratory, and this summer, I will be working as a software engineering intern at Google as part of the AdMob team!
Changed some design aspects of the site, added more projects, updated the bio. Basically everything I've been doing within the last few months.
I am now at the end of a summer internship at BloomReach, doing web development with their big data technologies. I had a great time working for the company and enjoyed experiencing startup culture, as well as the rest of the Bay Area! I had a blast at the many meetups, events, and hackathons I attended, learning about new technologies as well as new ways to use the ones I already knew. I will definitely use my time here as motivation to push myself even further during my second year of school.
Some of the photos of the things I have been doing for the past few months (including winning a robot competition at OSU) are shown below.
So the website looks a bit different now.
I decided to switch from the old jQuery-UI-based layout to one using Google's Polymer UI components and AngularJS. Polymer is definitely something new to me, but I learned it over the last week and really enjoy its approach to making web components modular, much like the old days of HTML; I can definitely see it being successful in the future. Right now some of the default components (like the top bar you see) can be a bit difficult to customize, but they also use the flexbox layout system, which I am currently getting familiar with.
With AngularJS, I am deriving all of this site's content from JSON and loading it into the Polymer-based template I have created. I have a JSON format that works well so far, but it will be interesting to see if it continues to suit my needs--I may even implement a web editor for it in the future.
Ever since I was young, I have been eager to discover, learn about, and build upon the world around me.
I first started working toward this goal in high school. It was there I discovered my interests in engineering and programming by taking computer science classes and helping the school robotics team with CAD and programming. Meanwhile, I challenged myself with advanced coursework in subjects such as physics, calculus, and chemistry. Near the end, I had my first workplace experiences through an IT internship at Multi-Color Corporation and a software development internship at Zakta, and also began to learn more advanced concepts on my own.
College brought forth a new wave of experiences, faces, and things to learn, and I began my time at The Ohio State University by participating in the Fundamentals of Engineering for Honors (FEH) program. This challenging sequence of classes broadened my overall perspective of engineering while teaching me important skills such as technical writing and teamwork. In particular, I loved the labs focusing on electronics, and couldn't wait to learn and do more with the various tools I had worked with. So without hesitation, I jumped headfirst into the field by buying an Arduino kit and exploring the ins and outs of microcontroller-based development. By the end of the semester, I had completed my first major project: a control system for two different types of higher-voltage lighting (link). I also continued to develop my software skills by learning from talks at OSU's Open Source Club and participating in my first hackathon with a web-reading alarm clock application (link).
The next semester I arrived eager to learn even more, and FEH brought just that with a challenging semester-long robot project, requiring teams of four to create a fully autonomous robot that could solve a sequence of tasks in under two minutes. I served as the team's main programmer and tester; we had our share of issues and speed bumps throughout the semester, but we finished the final competition by earning first place in the elimination round. To find out more about this project, you can view it in the Projects tab. Throughout the semester, I also continued learning Arduino and web development through several different projects, and worked with the medical center's imaging department to create several useful plugins for ImageJ, an image processing program.
The summer after my first year involved an internship with BloomReach in Mountain View, CA. As someone who had never been to Silicon Valley before, I found the atmosphere of the area amazing, with new skills to learn and people to meet everywhere. In addition to working on a challenging internal web application for data parsing, I participated in three different hackathons. All three taught me about team dynamics and how to create something amazing in very little time. In the last of these, the Meteor summer hackathon, my team built a web application for simultaneous control of a drone that won the "Most Entertaining" prize, and we presented it as part of the Meteor Devshop SF the week after! (link)
My second year at OSU so far has helped me grow even more in my studies. With my first ECE classes underway, I am learning critical core electrical engineering concepts and beginning to apply them in projects such as a Pong implementation on an FPGA. I also participated in two more hackathons, creating a gesture-controlled robotic zen garden (link) and a conducting simulator (link). To continue developing my software skills, I started a part-time internship working on the backend system for the disability services company all R friends. Meanwhile, I am continuing to work with OSU's image processing department, this time on a lightweight image viewer in GTK.
The next semester and beyond hold even more projects and opportunities. Supervised by Dr. Arnab Nandi, I will be part of a group of four building a modular CNC machine capable of 3D printing and other tasks. I will also begin a position as a research assistant in the OSU ElectroScience Laboratory. In May, I will be starting a software engineering internship at Google.
My main goal is to help create and shape new technologies that improve the lives of millions of people. In doing this, I plan to use my skills in the most effective way possible, work for the good of everyone, and look for ways to change the course of the electronics and software industries for the better.
I am currently completing my bachelor's degree in Electrical and Computer Engineering at The Ohio State University. During the remainder of my time here, I plan to take on challenging classes that focus on digital design and circuits while giving me an overall broad perspective of the field. I also plan to complete a thesis project by the time I graduate. Following this, I plan to complete a research-based master's degree, and then enter industry with a challenging position focusing on digital design or embedded systems.
Of the fields I have explored so far, I enjoy low-level programming and digital design the most: working as closely as possible with the underlying hardware brings a certain level of challenge but also freedom, and makes my work feel that much more rewarding when I understand everything that is happening. I am particularly interested in the rapidly growing industries of robotics, 3D printing, and wearables, as I feel we have only scratched the surface of what they offer. For example, 3D printing has made rapid prototyping easier than ever, but the technology is still not as accessible or capable as it could be. In addition, the precise mechanics used in 3D printing could be applied to a variety of unexplored applications. I look forward to working towards unlocking the untapped potential in these new technologies.
I am interested in working in an environment that is fast-paced, team-oriented, and focused on forging new ground in an emerging embedded systems-related field, whether that be a large tech company or a wearables startup. All the while, I will work with unrestrained motivation to solve difficult problems, for the benefit of both the company and the general public.
This project was created at TownHack 2014 in Columbus, Ohio. The code can be found here.
With this project, our goal was to control the playback of music using conducting motions. We ran into many problems figuring out how exactly to control the music, but ended up with a working solution that used VLC's HTTP API. Other ideas we discussed were using MIDI signals to control Mixxx or interfacing with Pure Data, but Mixxx's documentation gave us trouble, and getting sound stretching (altered speed without pitch change) working in Pure Data would have taken too much time. One problem in particular was CORS protection on VLC's API: our Ajax calls were being denied. To circumvent this, we used a PHP script that served as the authentication mechanism for VLC.
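As a rough illustration of driving VLC over its HTTP interface, the sketch below builds the request URLs. The endpoint path and command names are from memory of VLC's documented web interface, not from the original hackathon code, and the host/port are assumptions:

```python
from urllib.parse import urlencode

VLC_HOST = "http://localhost:8080"  # assumed default port for VLC's HTTP interface

def vlc_command(command, value=None):
    """Build a request URL for VLC's HTTP interface.

    Command names (e.g. "volume", "pl_pause") follow VLC's documented
    web API; double-check against your VLC version before relying on them.
    """
    params = {"command": command}
    if value is not None:
        params["val"] = value
    return f"{VLC_HOST}/requests/status.json?{urlencode(params)}"

# Example: set absolute volume (256 is 100% on VLC's scale) and toggle pause
volume_url = vlc_command("volume", 256)
pause_url = vlc_command("pl_pause")
```

In the actual project these requests went through the PHP script rather than directly to VLC, since the browser's CORS checks blocked direct Ajax calls.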
The finished project used two controls: the speed of the hand's motion to change the tempo of the music, and the height of the hand to control the volume. These changes were applied every four beats, keeping the music up to date without calling the API too often or misinterpreting a single motion.
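A minimal sketch of that hand-to-playback mapping might look like the following; the input and output ranges here are hypothetical, not the values the project actually used:

```python
def map_range(x, in_lo, in_hi, out_lo, out_hi):
    """Linearly map x from [in_lo, in_hi] to [out_lo, out_hi], clamped."""
    x = max(in_lo, min(in_hi, x))
    return out_lo + (x - in_lo) * (out_hi - out_lo) / (in_hi - in_lo)

def controls_from_hand(speed_mm_s, height_mm):
    """Hand speed -> playback rate, hand height -> volume (ranges assumed)."""
    rate = map_range(speed_mm_s, 100, 1500, 0.5, 2.0)
    volume = int(map_range(height_mm, 50, 400, 0, 512))
    return rate, volume
```

In the real app this mapping would only be evaluated once every four beats, as described above.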
Future goals for this project revolve around making it an entirely client-side application. This includes authenticating with VLC without the use of PHP, using beat-detection software to avoid manual tempo input, and better following of conducting controls (for example, holding a note if the user's hand stops moving).
The goal of this project is to automatically create sheet music from the notes heard by a user. The project is still not completely accurate and user-friendly, but a working model exists using Pure Data and the Python library Abjad.
The Pure Data patch takes several inputs from the user: the number of beat subdivisions to record, the tempo, and the number of measurements per beat subdivision. The user then hits the start button, and can see the beat represented through a blinking dot. After pressing stop, the data is written to a text file that can then be interpreted by a Python script. This script constructs the sheet music and displays it to the user in PDF form.
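The core of the interpretation step is collapsing the per-subdivision readings into notes with durations. The sketch below is a simplified stand-in for the real Pure Data-to-Abjad pipeline, using hypothetical data (MIDI note numbers, with None for silence):

```python
def group_notes(subdivisions):
    """Collapse per-subdivision pitch readings into (pitch, length) pairs.

    Consecutive equal readings merge into one note whose length is counted
    in subdivisions; None entries become rests. The actual script feeds
    comparable data into Abjad to typeset the sheet music.
    """
    notes = []
    for pitch in subdivisions:
        if notes and notes[-1][0] == pitch:
            notes[-1] = (pitch, notes[-1][1] + 1)
        else:
            notes.append((pitch, 1))
    return notes

# Example: C4 held for two subdivisions, a one-subdivision rest, then E4
print(group_notes([60, 60, None, 64]))  # [(60, 2), (None, 1), (64, 1)]
```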
This project was created at OHI/O 2014. The code can be found here.
Mechanically, this used spare printer parts for both X- and Y-axis motion, mounted on a wooden frame. The motors were driven by an H-bridge and controlled by an Arduino Uno. On the software side, the Arduino ran a sketch that used a one-byte control scheme to determine how to drive each motor; the byte was sent from a Meteor application using the node-serialport library. The Meteor app received signals from a Leap Motion on the front end using the LeapJS library, then used a two-point calculation to generate velocity signals for the Arduino. Following the hackathon, we learned from our experiences here and have begun work on a 3-axis CNC, supervised and funded by Dr. Arnab Nandi. A brief summary of the current project is available here.
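The hackathon code's actual byte layout isn't documented here, but one plausible one-byte scheme for two motors is a nibble per motor, each holding a sign bit and a 3-bit magnitude:

```python
def pack_motor_byte(x_speed, y_speed):
    """Pack two signed motor speeds (-7..7) into one byte.

    Low nibble = X motor, high nibble = Y motor; within each nibble,
    bit 3 is the direction and bits 0-2 the magnitude. This layout is
    an illustration, not the project's real encoding.
    """
    def nibble(v):
        assert -7 <= v <= 7
        return (8 if v < 0 else 0) | abs(v)
    return (nibble(y_speed) << 4) | nibble(x_speed)

def unpack_motor_byte(b):
    """Recover (x_speed, y_speed) from a packed control byte."""
    def value(n):
        mag = n & 0x7
        return -mag if n & 0x8 else mag
    return value(b & 0xF), value(b >> 4)

assert unpack_motor_byte(pack_motor_byte(3, -5)) == (3, -5)
```

On the Arduino side, the equivalent decode is a couple of mask-and-shift operations on the received byte.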
This project was created at the 2014 Meteor Summer Hackathon. It won in the category Most Entertaining. The code for this can be found here.
This app utilizes the Twitch Plays Pokemon method of control to allow many users to control a drone at once. It provides two modes of control: anarchy (queue of commands) and democracy (the most popular voted command is used). In order to host this on Meteor's servers, PubNub was used to communicate between the Meteor server and a client server controlling the drone. The app was also demoed live at Meteor Devshop SF in July 2014.
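The democracy mode described above boils down to tallying a window of votes and picking the winner. This is a sketch of that idea, not the app's actual implementation (which was JavaScript on Meteor):

```python
from collections import Counter

def democracy_winner(votes):
    """Pick the most popular command from one voting window.

    `votes` is the list of commands received during the window;
    returns None if nobody voted.
    """
    if not votes:
        return None
    return Counter(votes).most_common(1)[0][0]

print(democracy_winner(["up", "left", "up", "down"]))  # up
```

Anarchy mode is even simpler: every command goes straight into a queue and is executed in order.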
This project was created at HackSummit 2014 in San Francisco. I worked on a team of 10 for this project, and managed the API layer in a Django application. The backend can be found here and the frontend can be found here.
This app was created in response to the Kiva challenge at the hackathon, which involved creating a recommendation engine and UI for loans on Kiva that read data from various sources, including Facebook. The team, being large, decided to split the application into separate frontend and backend servers, which also made way for an iOS app. The application uses a Netflix-style UI to display different loan categories to the user, backed by an API that can be easily configured to accommodate different categories.
This project was created at Cardinal Health Codefest, and won the popular vote prize.
It provides an information service to patients, containing a "bulletin board" where they can share or see advice from others, a buddy system in which two patients with the same condition can sign up to email each other, and online news information about recent developments in the condition.
Using a Sparkfun MIDI shield and a Wii Nunchuk, I replicated the ocarina from the Legend of Zelda series with an Arduino. The code for this is located here, and contains songs from both Ocarina of Time and Majora's Mask.
I ran into several interesting challenges with this. Parsing songs from JSON was not easy: the libraries that existed as of January couldn't do it, and I ended up using a new library that appeared several months later to parse JSON strings. Memory management was (and still is) also an issue--the program mysteriously failed to run properly until I tried it on an Arduino Mega. I may still look into ways to cut down on memory usage, but the variety of songs held in memory already makes this difficult.
This robot was created as a part of the FEH (Fundamentals of Engineering Honors) robot competition at the Ohio State University. The robot won first place in the elimination round of the competition with a score of 89 out of 100 possible points.
The tasks required included pressing a button repeatedly, turning a switch from right to left, picking up a skid, placing the skid in a container, dropping a small spoon in a bin identified by a color on the floor, and pulling out a pin lodged in a pipe.
I worked on the code and testing of this robot. The code is located here. It was written in C++ for the internally developed Proteus board, and utilized a modular sequence of steps using input from multiple sensors.
This project uses an Arduino to control both EL wire and an RGB (non-addressable) LED strip simultaneously. It also reads input from a potentiometer, two buttons, and a PIR sensor, and displays text on an LCD screen. The wiring diagram is shown below.
The source code for each program can be found here.
The first program does not utilize the buttons or LCD. It gradually fades the LED strip through the entire color spectrum and blinks the EL wire in an interval specified by the position of the potentiometer. After doing this for a certain amount of time, both components shut off to save power. They are reactivated when the PIR sensor detects motion.
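The spectrum fade in the first program amounts to sweeping the hue while holding saturation and value at full. The Arduino sketch most likely stepped the three PWM values directly; the Python below just shows the same sweep compactly via the standard library's colorsys:

```python
import colorsys

def spectrum_color(step, total_steps=768):
    """RGB values (0-255 each) for one step of a full hue sweep.

    `total_steps` controls how finely the fade is subdivided; 768 is
    an arbitrary choice for illustration.
    """
    hue = (step % total_steps) / total_steps
    r, g, b = colorsys.hsv_to_rgb(hue, 1.0, 1.0)
    return round(r * 255), round(g * 255), round(b * 255)

print(spectrum_color(0))  # (255, 0, 0) -- pure red at the start of the sweep
```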
The second program performs similarly to the first program, but at the start it asks the user to input a sequence of colors by twisting the potentiometer to specify levels of red, green, and blue. After each color has been entered, the user can hit the bottom button to add another color or the top button to finish and start the sequence. Then, the EL wire performs as described in the first program while the LED strip switches between the user-defined sequence of colors in the same time interval the EL wire blinks. Both lights shut off after a certain period of time to wait for motion. The user can also press the top button during the light sequence to enter a new sequence of colors.
I had my fair share of problems and issues during the implementation of this project. Here are some of the ones I solved along the way:
And some improvements that could still be made:
This project was created as a team of 4 for the OSU 2013 hackathon and can be found here.
In this Windows application, the user can customize an alarm clock sequence that reads text or web data by adding modules and delays to a timeline. The included modules are weather, simple text, RSS, and calendar, but many more could be created simply by implementing the IModule interface. After that, they just need to be placed in the project directory--no modification of the program files needed.
Future goals for this project include porting it to mobile platforms such as Android and Windows Phone (where it really belongs, but no one on the team had mobile development experience), as well as adding more modules and cleaning up the interface to be more user-friendly. The application could also take voice commands; we considered adding this at the hackathon, but Microsoft's speech-to-text library in C# was not accurate enough.
This project was created with another person for the first semester of Fundamentals of Engineering Honors at OSU and can be found here.
The goal of this project was to detect one of 5 different possible frequencies using the FEH Proteus board and an IR receiver. The documentation can be found here. (note: the viewer displays the numbering incorrectly)
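One common way to detect a known set of IR frequencies is to measure the signal's period and snap it to the nearest candidate. The sketch below illustrates that approach; the five candidate frequencies are placeholders, since the project's actual values are in the linked documentation:

```python
def period_us_to_hz(period_us):
    """Convert a measured period in microseconds to a frequency in Hz."""
    return 1_000_000 / period_us

def classify_frequency(measured_hz, candidates=(100, 200, 500, 1000, 2000)):
    """Return the candidate frequency closest to the measured one.

    The candidate list here is hypothetical; the Proteus code would use
    whichever five frequencies the competition specified.
    """
    return min(candidates, key=lambda f: abs(f - measured_hz))

print(classify_frequency(period_us_to_hz(5050)))  # ~198 Hz snaps to 200
```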