Hey, I'm Kennan!
I first started programming on calculators with TI-BASIC back in high school, and I've been hooked ever since. In my spare time, I love cycling, skiing, and contributing to open-source software. I was an active competitor in the Rubik's Cube speedsolving community, where my highlights included a North American Record, 90 podium finishes, and a global 12th-place ranking for the 3x3 Rubik's Cube.
I completed my BS and MS in Computer Science at Case Western Reserve University with my thesis "Dynamic Structure Adaptation for Communities of Learning Machines," supervised by Dr. Soumya Ray. Now, I'm a software engineer at the Johns Hopkins University Applied Physics Laboratory. My current work focuses on applying NLP and ontological knowledge representation to detect early-stage biothreats using public information.
- Designed and graded course assignments and exams
- Taught supplementary lectures to reinforce course material
- Hosted weekly office hours to answer questions and provide feedback on theoretical written work and programming assignments
- Designed random forest, Bayesian network, and deep learning models for viral and bacterial threat classification using scikit-learn, TensorFlow, and Keras
- Implemented a data processing pipeline using dna2vec to perform feature extraction and dimensionality reduction from sequenced DNA
- Contributed to a large-scale Angular application that applies online learning to automated document tagging and classification
- Worked in a Kanban development environment to quickly and effectively deliver thoroughly documented, tested software to contract sponsors
- Contributed to an Android application written in Java and Kotlin to implement an attachment cache mechanism, reducing form upload time by as much as 75%
- Developed a web application with a Spring Boot backend, an Angular frontend, and Selenium tests, utilizing an internally designed UI library to deliver Elasticsearch-powered social media analytics
- Developed a job-scheduling application from scratch in Angular, designing a scheduling-optimization algorithm to increase data throughput
- Designed a Postgres database model to store jobs and their associated data, and built a corresponding REST API for application interaction
- Built an Amazon AWS management server that creates and destroys EC2 instances to efficiently allocate funds and expedite job processing