projects

Here is a list of the technical projects that I’ve done so far, from most to least recent. These projects come from a variety of sources – clubs, classes, summer programs, and personal motivation. Along the way I’ve picked up frontend and backend development, SVG processing, C++, and basic machine learning algorithms. This page only includes brief descriptions, but I’m more than happy to elaborate on anything here!

computer vision for navigating stata center (Spring 2024)

navigation system pt. 2, but with computer vision: for my final project for 6.869/6.8300 (Computer Vision), my teammate and I developed an image classifier + path planner to help people navigate the Stata Center specifically. The user submits a photo they took to our model, which classifies the image as belonging to one of 10 regions that we defined. We then compute the path to the user’s desired destination and display it on a modified map that we return to them.
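
To give a flavor of the pipeline (this isn’t our actual code), here’s a minimal Python sketch: a classifier assigns the photo to one of the 10 regions, and a shortest-path search over a hand-built region adjacency graph produces the route. The model file, region names, and graph edges are all made up for illustration.

```python
import networkx as nx
import torch
import torchvision.transforms as T
from PIL import Image

REGIONS = [f"region_{i}" for i in range(10)]  # placeholder region labels

# hypothetical fine-tuned classifier with a 10-way output head
model = torch.load("stata_region_classifier.pt")
model.eval()

preprocess = T.Compose([
    T.Resize((224, 224)),
    T.ToTensor(),
])

def classify_region(image_path: str) -> str:
    """Predict which of the 10 Stata regions a photo was taken in."""
    x = preprocess(Image.open(image_path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        logits = model(x)
    return REGIONS[logits.argmax(dim=1).item()]

# hand-built adjacency between regions (these edges are illustrative)
graph = nx.Graph()
graph.add_edges_from([
    ("region_0", "region_1"), ("region_1", "region_2"),
    ("region_2", "region_5"), ("region_5", "region_9"),
])

def plan_path(image_path: str, destination: str) -> list[str]:
    """Classify the photo's region, then route to the destination region."""
    start = classify_region(image_path)
    return nx.shortest_path(graph, start, destination)
```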

sarcasm detection (Fall 2023)

For my final project for 6.8611 (NLP), my team and I created a sarcasm detector that classified an input review as sarcastic or non-sarcastic. We also generated plots identifying which words our model interpreted as more or less sarcastic.
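
Our real model was fancier, but a bag-of-words baseline makes the “more or less sarcastic words” idea concrete: with a linear classifier, each word’s learned weight can be read as a sarcasm score. Here’s a rough Python sketch with stand-in data:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# stand-in training data; the real project used a labeled review corpus
reviews = [
    "oh great, another monday",           # sarcastic
    "wow, what a totally useful manual",  # sarcastic
    "this blender works really well",     # sincere
    "fast shipping and good quality",     # sincere
]
labels = [1, 1, 0, 0]  # 1 = sarcastic, 0 = not

vectorizer = TfidfVectorizer()
X = vectorizer.fit_transform(reviews)
clf = LogisticRegression().fit(X, labels)

# words with the largest positive weights read as "more sarcastic"
weights = sorted(
    zip(vectorizer.get_feature_names_out(), clf.coef_[0]),
    key=lambda pair: pair[1],
    reverse=True,
)
for word, weight in weights[:5]:
    print(f"{word}: {weight:+.3f}")
```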

autonomous racing and line-following (Spring 2023)

I took a class called 6.141/6.4200, or Robotics: Science and Systems. Over the course of the semester, I worked with a team to program a robotic racecar to do various tasks, including wall-following, line-following, object recognition, and path planning. We worked within the Robot Operating System (ROS) ecosystem, which provides a framework of nodes that publish and subscribe to streams of information – in our case, from Python code (there’s a tiny sketch of that pattern below). I had pretty much never done robotics before, so this class was a huge learning experience for me, and it really opened my eyes to the field.

I learned so much about implementing and tuning various algorithms: how to turn corners when line-following, PID control, pure pursuit, color segmentation for object recognition, Monte Carlo localization, simultaneous localization and mapping (SLAM), rapidly-exploring random trees (RRT), A*, and Hough transforms, just to name a few. Most importantly, this class taught me immense perseverance and the ability to methodically work through tons of potential causes to find the root of a problem. The class culminated in a final race on MIT’s indoor track, as well as navigating and parking through a mini “city”. Here’s the website documenting all of our lab slides and reports: https://rss2023-15.github.io/website/ and here are some videos from our final challenge!

racing on Johnson track! our robot is right in the middle, and it finished second in our (very contested) heat
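
Here’s roughly what that ROS publisher/subscriber pattern looks like. This is a generic rospy sketch, not our class code – the real car used an Ackermann drive interface, and the topic names and gains here are made up. One node subscribes to laser scans and publishes drive commands from a bare-bones proportional controller for wall-following:

```python
import rospy
from geometry_msgs.msg import Twist
from sensor_msgs.msg import LaserScan

DESIRED_DISTANCE = 0.5  # target distance to the right wall, meters (made up)
KP = 1.0                # proportional gain (made up – tuning was half the battle)

class WallFollower:
    def __init__(self):
        # publish drive commands, subscribe to the laser scanner
        self.drive_pub = rospy.Publisher("/cmd_vel", Twist, queue_size=1)
        rospy.Subscriber("/scan", LaserScan, self.on_scan)

    def on_scan(self, scan):
        # crude distance to the right wall: closest return in the first
        # quarter of the scan (assumes the scan starts on the robot's right)
        right = min(scan.ranges[: len(scan.ranges) // 4])
        error = DESIRED_DISTANCE - right  # positive when too close to the wall

        cmd = Twist()
        cmd.linear.x = 1.0          # constant forward speed
        cmd.angular.z = KP * error  # too close -> positive z -> steer left, away
        self.drive_pub.publish(cmd)

if __name__ == "__main__":
    rospy.init_node("wall_follower")
    WallFollower()
    rospy.spin()
```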

modeling radiation in the human brain (Winter 2023)

In January 2023, I participated in a research abroad program with the University of Sannio in Benevento, Italy. My project, with Professor Vincenzo Galdi, focused on simulating the radiation from a potential new MRI technique. This technique would incorporate metamaterials into a traditional MRI setup to enhance the signal-to-noise ratio (SNR), but the effects of this enhanced signal on the brain were unknown. I learned and used COMSOL, a multiphysics simulation package, to create the structures, set magnetic properties, and produce figures. I’m really grateful to have had this experience living and studying abroad, and I greatly treasure the hospitality that I received! Benevento will always be special to me.

lyric.al – a cross-platform music-sharing service (Winter 2022)

For 6.148 (web lab), my team and I decided to create a cross-platform music-sharing service. Creating collaborative playlists between two Spotify users is straightforward enough, but what about friends who use different platforms? Our goal was to create one shared site where you could see all of this information – friends’ favorite songs and playlists – and make collaborative playlists across all platforms. I designed the UI of the website: choosing a color scheme, making wireframes in Figma, and coding up the looks in HTML & CSS. I’ve included some of the wireframes that I made here, and here’s a link to our website: [link].

Wireframes of our website using Figma! We ended up using the second frame on the left for our final design.

but where exactly is it? – computer vision to navigate MIT (Fall 2021)

MIT’s campus is a confusing place, and it doesn’t help that everyone refers to everything only by numbers. As part of AIM Labs, a student group focused on creating applied machine learning projects, my team and I ideated and developed a navigation tool for MIT (like Google Maps). I worked on processing SVGs of MIT floor plans to identify doors, elevators, rooms, and room text, so that we could turn the floor plans into graphs and find paths between rooms. I can’t publicly share the code that we developed since we’re still expanding on the project, but here are our slides from the final presentation: slides.

the final product!

code that I helped write that identifies which SVG paths are doors, using good old geometry. identifying doors was crucial to the rest of the app, which relied on line-of-sight to show a path between two points
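
Since I can’t share the real code, here’s a hypothetical sketch of that kind of geometric filtering: parse the line segments out of a floor-plan SVG and keep the ones whose length falls in a door-sized range. Our actual heuristics were different, and the file name, attributes, and thresholds here are purely illustrative.

```python
import math
import xml.etree.ElementTree as ET

SVG_NS = "{http://www.w3.org/2000/svg}"
DOOR_MIN, DOOR_MAX = 0.7, 1.2  # door-sized lengths in floor-plan units (made up)

def find_door_candidates(svg_file):
    """Return <line> segments whose length looks door-sized."""
    root = ET.parse(svg_file).getroot()
    candidates = []
    for line in root.iter(f"{SVG_NS}line"):
        x1, y1 = float(line.get("x1")), float(line.get("y1"))
        x2, y2 = float(line.get("x2")), float(line.get("y2"))
        length = math.hypot(x2 - x1, y2 - y1)
        if DOOR_MIN <= length <= DOOR_MAX:
            candidates.append(((x1, y1), (x2, y2)))
    return candidates

# hypothetical file name for illustration
print(find_door_candidates("building_32_floor_1.svg"))
```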

machine learning with python – online coursework (Summer 2021)

During summer 2021, I wanted to finally learn some basic machine learning concepts and try my hand at implementing them. So, I audited the online course Machine Learning with Python by IBM. I’ve uploaded my work for the assignments to this repository, which includes regression, k-nearest neighbors, k-means clustering, decision trees, and SVMs.
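
As a taste of what the assignments involved, here’s the kind of thing you end up writing – a k-nearest-neighbors classifier in scikit-learn on a toy dataset (a generic example, not one of the actual assignments):

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# toy dataset standing in for the course assignments' data
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# classify each test point by majority vote among its 5 nearest neighbors
knn = KNeighborsClassifier(n_neighbors=5).fit(X_train, y_train)
print(f"test accuracy: {knn.score(X_test, y_test):.2f}")
```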

never fall flat – music storage and transcription system (Spring 2021)

I created a full end-to-end music transcription system with my group for the final project of my Interconnected and Embedded Systems class (6.08). All of us had been musicians at some point, and we shared a common grief: not being able to instantly play any song that we heard. So, we created a distributed system that could record melodies through an Arduino, send this information to a Python server for storage in a database, display a YouTube video of that song, and accurately draw sheet music of that melody on an LCD screen. I built our backend structure and databases in Python and SQL, wrote part of our HTML web interface, and created the C++ note transcription script from scratch. Since this was a class project, there’s even less code that I can share publicly, but I’ve uploaded the note transcription script, and here’s the final video of our system!
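
The heart of note transcription is mapping a detected frequency to the nearest named pitch. Our script did this in C++ (along with note durations and more), but the central conversion looks something like this Python sketch, using the standard MIDI formula with A4 = 440 Hz:

```python
import math

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F",
              "F#", "G", "G#", "A", "A#", "B"]

def frequency_to_note(freq_hz: float) -> str:
    """Map a frequency to its nearest equal-tempered note name."""
    # MIDI note number: A4 (440 Hz) is note 69, with 12 semitones per octave
    midi = round(69 + 12 * math.log2(freq_hz / 440.0))
    name = NOTE_NAMES[midi % 12]
    octave = midi // 12 - 1  # MIDI octave convention: note 60 is C4
    return f"{name}{octave}"

print(frequency_to_note(261.63))  # -> C4
print(frequency_to_note(452.0))   # a slightly sharp A4 still rounds to A4
```

Rounding to the nearest semitone is also what makes the system forgiving of slightly out-of-tune playing.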

coveducation website – updating UI and video-conferencing (Winter 2021)

I worked as a student consultant for the nonprofit CovEducation in January 2021. They were developing the newest iteration of their website, and our student team was tasked with fixing some UI bugs and integrating video-conferencing capabilities into the site. I used React to update their frontend, adding a carousel of user testimonials, and worked with my team to add in Jitsi Meet, as well as reorganizing parts of their Firebase backend to tie the video-conferencing and user databases together. Here is the specific branch that I worked on: github.

the front page of coved.org – I updated these two sections with the correct headers and body text, and created the layout of “How CovEd Works”
a second section of the front page – I matched icons with subject names to create this layout

texans’ health – website from scratch (Summer 2019)

I created this project as the final project for the Girls Who Code Summer Immersion Program in 2019. This website encourages healthy eating by showcasing a wide range of healthy recipes and local, inexpensive grocery stores. I built this website entirely from scratch using HTML, CSS, and JavaScript, creating every aspect of the UI and functionality. You can visit the site at https://texashealth20.github.io/#/ and find the GitHub repository here.

the “Recipes” page of the site – each recipe was linked through its image, with an up arrow at the bottom right to return to the top of the page.
clips from the “Got Food?” page – above, a dynamic slideshow of food items; below, a searchable map from the Google Maps API