Projects

How I build

I build in Cursor and use Codex as my coding assistant. I start with a short spec and a couple of examples, then let it draft the first pass. I make the architecture decisions, review everything, and iterate with tests until the behavior matches the intent; nothing ships without a final review. For writing, I use AI to tighten and proofread, not to write for me.

Built out of necessity

Financial Normalization & Diagnostics Tool

Snapshot

I built an ETL tool to take raw QuickBooks exports, normalize the underlying accounting structure, flag common hygiene issues, and reliably generate three-statement outputs plus decision dashboards.

Problem

QuickBooks is great for running a small business, but it isn’t built for consistent analysis and visualization out of the box. Over time, coding habits drift and categories get reused inconsistently, so the same transaction type can land in different places depending on who entered it and when. Without normalization, trendlines get noisy, unit economics get distorted, vendor concentration gets overstated or understated depending on how things were coded, and cash becomes harder to reconcile to what the business actually did. The cleanup is a manual necessity with a lot of automation potential: balance sheet activity shows up on the P&L, COGS ends up in OpEx, duplicates inflate spend, and month-to-month variance turns into a bookkeeping artifact instead of operational truth.

Constraints

This was built in small-business reality. Bookkeeping wasn’t perfectly consistent, category definitions shifted over time, and I didn’t have time for deep manual cleanup inside QuickBooks every month. The outputs had to reconcile cleanly enough to support real decisions and hold up under scrutiny, while staying fast to run and easy to maintain.

Approach

I built a pipeline that starts from exports and produces a dataset I can trust. It standardizes the chart of accounts, applies consistent categorization rules, and flags duplicates and misclassifications—including balance sheet versus P&L placement issues and common COGS versus OpEx errors. From that normalized base, it produces a standardized P&L, balance sheet, and cash flow statement, then feeds dashboards for KPIs, vendor spend, and customer concentration patterns.
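The rule engine at the heart of that normalization can be sketched in miniature. This is an illustrative stand-in, not the actual implementation: the rule patterns, category names, and transaction fields below are all made up.

```python
import re
from collections import Counter

# Hypothetical category rules: (regex over the memo text, normalized category).
# The real tool's rules and chart-of-accounts names are not shown here.
RULES = [
    (re.compile(r"freight|shipping", re.I), "COGS:Freight"),
    (re.compile(r"software|subscription", re.I), "OpEx:Software"),
    (re.compile(r"loan principal", re.I), "BalanceSheet:LoanPrincipal"),
]

def normalize(txn):
    """Apply the first matching rule; flag anything that falls through."""
    for pattern, category in RULES:
        if pattern.search(txn["memo"]):
            return {**txn, "category": category, "flag": None}
    return {**txn, "flag": "unmapped"}  # surface for manual review

def flag_duplicates(txns):
    """Mark exact repeats of (date, vendor, amount) as possible duplicates."""
    counts = Counter((t["date"], t["vendor"], t["amount"]) for t in txns)
    return [
        {**t, "dup": counts[(t["date"], t["vendor"], t["amount"])] > 1}
        for t in txns
    ]
```

Ordering rules from specific to general keeps placement issues (like loan principal hitting the P&L) from being swallowed by broader matches, and anything unmapped gets flagged rather than silently passed through.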

Output

The deliverables are a standardized three-statement set plus dashboards for core KPIs, vendor spend concentration, and customer concentration and spend patterns. I use it to explain variance quickly and tie it back to the exact transactions that caused it, then fix the source entries in QuickBooks so the next month is cleaner.

Tools
ETL · Excel · Flask · HTML · Python · Regex · Tie-out validation

Route Optimizer

Snapshot

I built a route optimizer and capacity model that turns a recurring field service portfolio into a weekly plan crews can run without guesswork. It reduces scheduling to a repeatable process that balances contract cadence with cost to serve.

Problem

In field service, routing decides whether the week feels smooth or chaotic. Most small operators schedule by hand and rely on memory, and that only works while everything stays predictable. As the portfolio grows and the week gets messy, the plan starts to degrade. Drive time creeps up, service time estimates drift, and what looked reasonable on paper turns into rushed work, missed visits, or constant midweek reshuffling. The tools I found were either priced for enterprise-level operations or built for delivery routing rather than recurring service with frequency requirements.

Constraints

Service time changes by property and season, crews change week to week, and the morning plan has to be simple. Customers have timing preferences, and weather can force a replan on short notice. The model needed to run fast, allow practical overrides, and still reflect how work actually gets executed.

Approach

I load the accounts and requirements and run the optimizer. It normalizes addresses, applies service time assumptions, estimates travel between stops, and produces routes that fit crew capacity while honoring cadence. It surfaces the small set of accounts that break assumptions so I can intervene where judgment matters instead of hand-building the entire week. After the week runs, I use actuals to tighten service time assumptions so the plan improves over time.
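The real optimizer runs on Google OR-Tools with clustering and an ILP formulation; as a rough stdlib-only illustration of the capacity-aware routing idea, here is a nearest-neighbor sketch. The stop names, coordinates, and the distance-equals-minutes travel model are all assumptions for the example.

```python
import math

def route_nearest_neighbor(depot, stops, capacity_minutes):
    """Greedy sketch: visit the closest remaining stop until estimated
    travel + service time exceeds crew capacity.
    `stops` is a list of (name, (x, y), service_minutes) tuples;
    travel time is approximated as Euclidean distance in minutes."""
    remaining = list(stops)
    route, used, pos = [], 0.0, depot
    while remaining:
        nearest = min(remaining, key=lambda s: math.dist(pos, s[1]))
        travel = math.dist(pos, nearest[1])
        if used + travel + nearest[2] > capacity_minutes:
            break  # over capacity: leave the rest for another crew or day
        route.append(nearest[0])
        used += travel + nearest[2]
        pos = nearest[1]
        remaining.remove(nearest)
    return route, remaining  # scheduled stops, overflow to flag
```

The overflow list plays the same role as the exception accounts in the real tool: it is the short list worth a human look before the week starts.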

Output

It outputs a printed weekly schedule by crew and a capacity view that flags overload and the few exception accounts worth reviewing before the week starts. I’m now working on packaging the program for broader use cases and building a frontend for it.

Tools
Excel · Genetic algorithms · Google Maps Distance Matrix API · Google OR-Tools · Hierarchical clustering · HTML · ILP · Python · Regex

Built for the role

NLP root-cause clustering (Dover Food Retail)

At Dover, I built a program to turn messy contractor notes into structured root-cause signals. It read contractor notes, generated embeddings, then used dimensionality reduction and clustering to form hierarchical groups of similar notes. We ran a supervised exercise with Quality Engineers who reviewed clusters and assigned failure modes/root causes; as more clusters were labeled, the system got better at identifying root-cause failures and separating signal from noise.

It was a major upgrade over Excel-based keyword searches. In its end state, it let the quality teams search contractor notes in natural language to surface candidates for review. The output fed a Power BI report used to brief Quality Engineering and product teams on what was failing in the field, and I handed the project off to Dover corporate’s data science team.
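The production pipeline used transformer embeddings with UMAP for dimensionality reduction and HDBSCAN/k-means for grouping. As a toy illustration of the grouping step alone, here is a stdlib-only k-means over small vectors, where the vectors stand in for note embeddings; everything about it is simplified.

```python
import math
import random

def kmeans(vectors, k, iters=20, seed=0):
    """Toy k-means over embedding vectors (lists of floats).
    Repeatedly assigns each vector to its nearest centroid, then
    recomputes each centroid as the mean of its assigned vectors."""
    rng = random.Random(seed)
    centroids = rng.sample(vectors, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in vectors:
            i = min(range(k), key=lambda c: math.dist(v, centroids[c]))
            clusters[i].append(v)
        centroids = [
            [sum(dim) / len(cl) for dim in zip(*cl)] if cl else centroids[i]
            for i, cl in enumerate(clusters)
        ]
    return clusters
```

In the real system the interesting work happened upstream (getting embeddings that place similar contractor notes near each other) and downstream (Quality Engineers labeling clusters with failure modes); the clustering itself is the simple part shown here.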

Tools
BERT · Clustering · Dimensionality reduction · Embeddings · GPT-1 · Hugging Face · Jupyter Notebooks · HDBSCAN · K-means · Power BI · Python · UMAP

Personal projects

Household Meal Optimizer

A long-running personal project that applies optimization principles to a home-economics problem. The target is to plan meals that fit needs, preferences, and goals while staying inside a real budget.
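The real planner uses a MILP solver over nutrition and price data; the core trade-off can be shown with a tiny brute-force stand-in. The meal names, costs, and preference scores below are invented for the example.

```python
from itertools import combinations

def plan_meals(meals, budget, n_meals):
    """Brute-force stand-in for the MILP: pick `n_meals` meals that
    maximize a preference score while total cost stays within budget.
    `meals` is a list of (name, cost, score) tuples."""
    best, best_score = None, -1.0
    for combo in combinations(meals, n_meals):
        cost = sum(m[1] for m in combo)
        score = sum(m[2] for m in combo)
        if cost <= budget and score > best_score:
            best, best_score = combo, score
    return best

# Invented sample data for illustration.
meals = [("tacos", 8, 4), ("salmon", 15, 9), ("stir fry", 7, 6), ("pasta", 5, 5)]
```

Brute force only works for a handful of meals; a MILP formulation carries the same objective and budget constraint but scales to real pantry-sized catalogs and extra constraints like nutrient targets.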

Tools
CSS · FastAPI · FlavorDB2 · Kroger API · LLM validators · MILP solver · Next.js · Node.js · Postgres · Python · Supabase · TypeScript · USDA FDC

Lifting Log (iPhone app)

A small iPhone app I’m building for my workouts. Simple logging, low friction, and a clean history of training. It’s close to done.

Tools
CSS · IndexedDB · PWA · React · React Router · TypeScript · Vite · Vitest