PNC Bank · MIT Lincoln Laboratory · Brown MSc ’23
I didn’t start in product; I started by building. Before becoming a product manager, I was a data scientist, learning technologies like LLMs by implementing them in real systems rather than presenting about them. I’ve shipped models that drive measurable business impact and navigated the realities of deploying in regulated environments, where every decision is questioned and every assumption is tested. That experience informs how I build products today: anchored in first principles, adaptable across industries, and driven by data rather than assumptions.
I first studied at the New Jersey Institute of Technology in the Albert Dorman Honors College, majoring in finance and computer science. NJIT is one of only a few schools on the East Coast with men's volleyball, and I was fortunate to receive a scholarship to play. After graduating in just three years, I went on to Brown University, where I earned my MSc in Data Science. While there, I worked at MIT's Lincoln Laboratory building NLP intelligence infrastructure for a Department of Defense research program that must remain classified.
I later joined PNC as a Data Scientist and was tapped for the TPM role eight months later, stepping into a seat that would typically be filled from an MBA pipeline. My background is why I can sit with engineers and pressure-test model architecture in the morning, then translate the same conversation into business risk and return for a credit committee in the afternoon.
When I'm not working, you'll most likely find me running; I train upwards of 70 miles a week. I'm a competitive marathoner, currently preparing to run a 2:45 this April at the Jersey City Marathon. It will be my eighth.
I also have a creative outlet in photography. My subjects are mostly runners, but I shoot street photography and landscapes from my travels as well. My work has been featured by brands like New Balance and by influencers with millions of followers.
Fine-tuned BERT model inferring income from transaction history for customers without verified income on file. Unlocked 215K customers for credit line increases, added $8.6M in annual profit, passed Fair Lending review without a single adverse finding.
Pre-campaign data validation system using z-score drift detection against a 90-day rolling baseline. Caught a silent upstream field change at 11.8 standard deviations — blocked a campaign before a single decision ran and prevented a regulatory disclosure event.
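A minimal sketch of that style of check, assuming normally distributed field values; the function names, window handling, and the alert threshold here are illustrative, not the production system's:

```python
import numpy as np

def drift_z_score(baseline: np.ndarray, batch: np.ndarray) -> float:
    """Z-score of the new batch's mean against a trailing baseline.

    baseline: field values from the rolling window (e.g. the prior 90 days).
    batch:    the incoming values for the same field, pre-campaign.
    """
    mu = baseline.mean()
    # Standard error of the batch mean under the baseline distribution.
    se = baseline.std(ddof=1) / np.sqrt(len(batch))
    return (batch.mean() - mu) / se

def check_field(baseline, batch, threshold: float = 4.0):
    """Return (is_drifted, z): flag the field if |z| exceeds the threshold."""
    z = drift_z_score(np.asarray(baseline, dtype=float),
                      np.asarray(batch, dtype=float))
    return abs(z) > threshold, z
```

A silent upstream change (say, a field switching from dollars to cents) shifts the batch mean far outside the baseline's sampling distribution, producing the kind of double-digit z-score that blocks the campaign before any decisions run.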
Agentic AI portfolio surveillance system in development. LLM reasoning loops synthesize Kafka transaction signals, portfolio data, and macroeconomic indicators into auditable credit parameter proposals — replacing manual analyst-driven surveillance.