Internet of Things Scoreboard

BattleSnake competition display

May 2019

A physical button, connected to a NodeMCU Lua board (ESP8266), incremented our own and our competitors' scores in DynamoDB during Victoria's BattleSnake competition. This was an improvement over using pen and paper, and allowed Redbrick to have some branding at the table. A simple React app updated in real time with the latest scores. The project also used AWS Cloud Map and Fargate.
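The core interaction (a button press atomically bumping a score) can be sketched as below. In the real project the ESP8266 talked to DynamoDB; here a plain dict stands in for the table, and the function name `press_button` is illustrative, not from the project.

```python
# Minimal sketch of the scoreboard update. A dict stands in for the
# DynamoDB table; the real project performed the equivalent of an
# UpdateItem with an "ADD score :inc" update expression, which is
# atomic on DynamoDB's side.
scores = {}

def press_button(snake: str, inc: int = 1) -> int:
    """Record a button press for `snake` and return the new score."""
    scores[snake] = scores.get(snake, 0) + inc
    return scores[snake]

press_button("redbrick")
press_button("redbrick")
press_button("competitor")
```

A front-end polling (or subscribing to) the same table then only needs to read the latest counts to stay current.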

Wedding RSVP Microservice

A protected and central event info page

May 2019

I used an S3 redirect to add a new page route to my existing website. The page was protected by a token code printed on each guest's mailed invitation. A DynamoDB table stored each guest's food choice, dietary restrictions, and song suggestion.
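The token-gated flow can be sketched as follows. A dict stands in for the DynamoDB table keyed by invitation token, and the field names (`food`, `dietary`, `song`) are assumptions for illustration, not the actual schema.

```python
# Sketch of the token-protected RSVP flow. Guests enter the code printed
# on their invitation; only known tokens may submit a response.
guests = {
    "A1B2": {"name": "Guest One", "food": None, "dietary": None, "song": None},
}

def submit_rsvp(token, food, dietary=None, song=None):
    """Validate the invitation token and record the guest's choices."""
    if token not in guests:
        raise KeyError("unknown invitation token")
    guests[token].update(food=food, dietary=dietary, song=song)
    return guests[token]
```

Because the token doubles as the table's primary key, validation and lookup are a single read.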

Big Data / MapReduce Analysis: Implementing a Microservice

Data Mining

Any user can download their personal Facebook information from the settings page. Using some of this data, this project analyzes their messaging habits and their top 3 Messenger friends.

The first iteration implemented MapReduce to analyze message history. Written in Go, it distributes and parallelizes processing of the user's data since the day they joined Facebook. Data is stored in memory using an efficient prefix trie, and the final output is a pie chart of the user's most active hours.

The second iteration is a proof of concept that implements a server-based microservice architecture. Users upload their file via the front-end, which triggers pre-processing on an EC2 instance, followed by MapReduce on an EMR cluster, and finally post-processing on the same EC2 instance to generate a PNG graph. The graph is displayed in-context on the upload page.

Finally, the third iteration aims to implement a serverless microservice, using AWS Lambda to trigger the EC2 processing and EMR analysis.
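The hour-binning stage behind the pie chart can be sketched with a toy mapper and reducer. This is a single-machine Python sketch of the idea, not the Go implementation, and it assumes each message record carries a `timestamp_ms` field as in Facebook's JSON download.

```python
from collections import Counter
from datetime import datetime, timezone

# Sketch of the MapReduce stage that bins messages by hour of day.
# The mapper emits (hour, 1) pairs and the reducer sums them -
# the counts are the slices of the "most active hours" pie chart.
def mapper(message):
    ts = message["timestamp_ms"] / 1000
    hour = datetime.fromtimestamp(ts, tz=timezone.utc).hour
    return (hour, 1)

def reducer(pairs):
    counts = Counter()
    for hour, n in pairs:
        counts[hour] += n
    return counts

messages = [{"timestamp_ms": 1_500_000_000_000},
            {"timestamp_ms": 1_500_003_600_000}]
active_hours = reducer(map(mapper, messages))
```

In a real MapReduce run the pairs would be shuffled by key across workers before reduction; here the two phases simply run in-process.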


Frequent Itemset Mining with 1% support

Data Analysis and Distributed Computing

February 2018

I continued work on data mining and software engineering research already done in this field. I tried to improve the performance of a new algorithm, a variant of the standard Apriori algorithm for finding frequently co-occurring items in transactions. The new algorithm improves on the standard by requiring only a 1% support level instead of 5%, which returns significantly more useful results for use in, for example, a recommendation system. My changes included implementing a custom RecordReader, RecordWriter, CombineInputFormat, and OutputFormat to merge the results of each mapping phase into one file that is evenly re-split across several mappers (while maintaining a global ordering), instead of having mappers execute on wildly different file sizes. This was done in Java using the Apache Hadoop framework. This project and repository are private.
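What the support threshold controls can be shown with a minimal, single-machine sketch of Apriori's counting step. This is only the core idea in Python, under assumed inputs; the actual project ran in Java on Hadoop with the custom input/output formats described above.

```python
from collections import Counter
from itertools import combinations

# Minimal sketch of Apriori's k-itemset counting step. An itemset is
# "frequent" if it appears in at least min_support of all transactions.
def frequent_itemsets(transactions, min_support, k=2):
    """Return k-itemsets appearing in >= min_support of transactions."""
    threshold = min_support * len(transactions)
    counts = Counter()
    for t in transactions:
        for combo in combinations(sorted(set(t)), k):
            counts[combo] += 1
    return {items for items, c in counts.items() if c >= threshold}
```

Lowering `min_support` from 0.05 to 0.01 admits many more candidate itemsets, which is why evenly balancing work across mappers mattered for performance.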

Angular Web App

First iteration of a personal website

September 2015

My first attempt at a personal site built the Angular 2 skills I needed at SKIO Music. Like the current version, it routed to multiple pages, followed CSS styling best practices, and was hosted as a static site in S3. A few experiments within the site used a Django backend.

Software Evolution: Research Hub

Challenge: Map the Evolution of Python

January 2015

As part of a research effort, students needed to create an automatic tool, script, or method for investigating a code base to see how it evolved over time. Each product needed a textual description and a visual result of its findings. After analyzing Python, my team built a single-page Angular app to unify each team's research into one cohesive tool. It also provided a single place to find resources such as papers and lectures related to the topic. Users could search for any of the 10 tools - either by codebase or language - and execute them within the app, with the full results displayed in-browser.

Tutorial (Screenshots)


Python GUI and Automatic Scripting Program

Word-of-the-day Email Parser

Early 2014

To increase my German vocabulary and practice while living abroad, I subscribed to a mailing list that sent up to 2 emails per day. Each message included a new word, its translation, and an example sentence in both German and English. I wrote a script to parse each email into a text file to study later, then delete the email from my inbox. The result included a Python GUI for tasks such as entering email login information, selecting either POP3 or IMAP access to the inbox, and choosing the save file location.
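The parsing step can be sketched with Python's standard `email` module. The message layout below (word, translation, example on the first three body lines) is an assumption for illustration; the real mailing list's format differed, and fetching via POP3/IMAP and deleting the message are omitted.

```python
import email

# Sketch of parsing one word-of-the-day message into a study entry.
# Assumes a plain-text body whose first three lines are the word, its
# translation, and an example sentence (an illustrative format).
def parse_word_email(raw: str) -> dict:
    msg = email.message_from_string(raw)
    body = msg.get_payload()  # plain-text, non-multipart message assumed
    word, translation, example = body.strip().splitlines()[:3]
    return {"word": word, "translation": translation, "example": example}

sample = """\
From: words@example.org
Subject: Wort des Tages

der Hund
the dog
Der Hund schlaeft. / The dog sleeps.
"""
entry = parse_word_email(sample)
```

Each parsed entry would then be appended to the text file at the save location chosen in the GUI.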

Tutorial (Screenshots)