I'm Dustin Cole. I lead meter analytics at LG&E, where I manage the infrastructure behind 700 million interval rows a day. I build predictive models, migrate data ecosystems, and design reporting tools that help people make real decisions.
My career started in operations. I was supervising accounts payable at Baptist Health, consolidating AP departments across five hospitals and migrating finance systems, and at some point I started building dashboards for the executive team. That part stuck.
From there it was healthcare supply chain analytics at Baptist, then LG&E for the last several years -- working my way from spend analytics and contractor cost modeling up through building the full meter data infrastructure that the utility runs on today. I've worked across Oracle, Qlik, R, Python, Power BI, and Snowflake. The thread through all of it is the same: figure out what the business actually needs to see, then build the thing that shows it.
The work I'm most proud of tends to be the kind that gets used every day by people who have no idea how it works, and that's exactly how it should be. A predictive model for meter overheating that operators rely on each summer. A drill-down dashboard ecosystem that goes from system-level status down to an individual meter's interval history. A Snowflake migration that moved 700 million rows a day off aging on-prem infrastructure and into something the whole organization can build on.
Outside of work I'm finishing a Master's in Data Science at Indiana University, writing a weekly newsletter called The Data Nerve, and learning what it means to be a dad to a one-year-old who has better things to do than look at my dashboards.
The best data work is invisible. It runs in the background, gives people confidence in the numbers, and gets out of the way.
Career History
Click any role to see the full story -- what was built, what the stakes were, and what it taught me.
This is where data clicked for me. I was brought in to consolidate AP departments across five hospitals and all the doctors' offices into a single system. That meant a full migration to PeopleSoft for finance and supply chain, setting up AnyDoc for OCR invoice processing, and rebuilding how payment workflows were tracked and approved.
Once things were running, I built executive KPI dashboards in Oracle Cloud Analytics pulling AP data from across the entire health system. That was the moment I realized I cared more about the dashboards than the AP work. The whole rest of my career has been chasing that feeling.
This role is where the interest in analytics started. Everything after it was intentional.
Moved into a dedicated analytics role focused on supply chain spend. The core work was physician preference items and value analysis automation -- which sounds dry until you're in a room with surgeons and hospital executives debating whether to spend more on a specific type of suture because one doctor prefers it.
The range was genuinely wild. Toilet paper to MRI machines. I worked alongside medical councils to help them balance quality, value, and cost with actual data behind the decisions, not gut instinct. Qlik was the primary BI tool here, and I leaned heavily on Excel VBA to automate the parts that kept eating hours every week.
This is where I learned that good analysis has to connect to a decision someone is actually going to make. If it doesn't change behavior, it doesn't matter.
First role at LG&E, moving into utilities. I administered the TRAC contractor time-tracking system and managed approval workflows for contractor labor at power plants, all the way through to Oracle finance payments. The analysis side covered blanket purchase agreements, purchase agreements, and invoice spend for commercial operations across all plants.
Then COVID hit. Plant managers suddenly needed to make decisions about how many contractors to keep on site under different shutdown scenarios -- and there was no tool to help them think through the costs. I built a sequestration model in Excel VBA that modeled cost outcomes across multiple plant configurations.
This is the role where everything accelerated. I built ETL pipelines using R scripts triggered in SSMS, running on a VM I helped configure. Then came the Power BI ecosystem -- pulling from seven-plus systems and APIs -- built from scratch and grown into something the operations team actually relied on daily.
The flagship dashboard drilled from system-level meter communication health down to a single meter, then out to an ESRI map showing that meter and its neighbors, then into Google Maps, then into paginated interval data reports showing every reading -- gaps, overheating events, threshold proximity, full history. One tool, one continuous drill-down from the grid to a single device.
On the modeling side, I built a Poisson regression model to forecast high-temperature meter events. Inputs: meter population, cooling degree days, a 20-year weather baseline built with weather year methodology (historical same-day temps), and prior high-temp events. Then I added Monte Carlo simulation and a live weather.gov API integration to predict overheating risk 7 to 10 days out with confidence ranges.
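For the curious, the bones of that model are simple enough to sketch. Here's a minimal illustration in Python with made-up numbers and hypothetical feature names -- the production version has more features, real meter history, and proper validation behind it:

```python
# Minimal sketch of the forecast shape -- invented data, hypothetical
# feature names; the production model is trained on real meter history.
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Historical daily observations: cooling degree days, meter population
# (thousands), prior high-temp event count, and observed events that day.
hist = pd.DataFrame({
    "cdd": [2, 5, 9, 14, 18, 22, 25],
    "meters_k": [410, 412, 415, 420, 423, 427, 430],
    "prior_events": [0, 1, 3, 6, 9, 14, 18],
    "events": [0, 1, 2, 5, 8, 13, 17],
})

X = sm.add_constant(hist[["cdd", "meters_k", "prior_events"]])
model = sm.GLM(hist["events"], X, family=sm.families.Poisson()).fit()

# Forecast day: expected event rate, then Monte Carlo draws for a range.
day = sm.add_constant(
    pd.DataFrame({"cdd": [20], "meters_k": [430], "prior_events": [15]}),
    has_constant="add",
)
lam = float(model.predict(day)[0])
draws = np.random.default_rng(0).poisson(lam, 10_000)
lo, hi = np.percentile(draws, [5, 95])
print(f"expected events: {lam:.1f}, 90% range: {lo:.0f}-{hi:.0f}")
```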
Leading analytics and operations for the entire AMI metering program. The first big lift was a full cloud migration -- moving roughly 700 million interval rows per day from on-prem infrastructure to Snowflake. The entire data ecosystem followed: pipelines, 100-plus Power BI reports, governance, all of it.
The current focus is on AI integration. I've built Python pipelines using LLMs and Snowflake Cortex AI so operational users can ask questions in plain English and get back SQL-generated charts and tables -- no query knowledge required. Semantic models define the business logic underneath. I also piloted Copilot in Power BI for graph development with the team.
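Stripped down, the pattern is short enough to show. This is a hedged sketch against a hypothetical METER_READINGS table, using Snowflake's CORTEX.COMPLETE function -- the production pipelines layer semantic models, query validation, and charting on top:

```python
# Sketch of the plain-English-to-SQL pattern. The table is hypothetical;
# production adds semantic models, validation, and chart generation.
import snowflake.connector

SCHEMA_CONTEXT = """Table METER_READINGS(meter_id STRING,
read_ts TIMESTAMP, kwh FLOAT, temp_f FLOAT)"""

def answer(question: str, conn) -> list:
    prompt = (
        f"Given this schema:\n{SCHEMA_CONTEXT}\n"
        f"Write one Snowflake SQL query that answers: {question}\n"
        "Return only the SQL, no explanation."
    )
    cur = conn.cursor()
    # CORTEX.COMPLETE runs an LLM inside Snowflake; model choice may vary.
    cur.execute(
        "SELECT SNOWFLAKE.CORTEX.COMPLETE('mistral-large', %s)", (prompt,)
    )
    sql = cur.fetchone()[0].strip().strip("`")  # naive fence cleanup
    cur.execute(sql)       # run the generated query
    return cur.fetchall()  # rows to feed a chart or table

conn = snowflake.connector.connect()  # default connection from config
print(answer("Which 10 meters logged the most kWh yesterday?", conn))
```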
Outside of the core job, I've built personal AI agents using Twilio, Make.com, VAPI, and Formspree, built websites, and am actively integrating AI tools into every workflow I touch. Daily user of Claude, ChatGPT, and a rotating set of other LLM tools. This isn't exploration -- it's how I work.
Every skill here has a project behind it. Hover any tag to see where it was applied. The dot shows depth: teal is expert, blue is proficient, gold is applied.
These aren't portfolio demos built for a GitHub audience. They're tools that plant managers made decisions with, operators checked every summer morning, and a whole utility runs on today.
This was the migration of an entire on-prem data operation -- pipelines, storage, reports, governance -- to Snowflake's cloud architecture. Roughly 700 million interval data rows per day, moving from aging infrastructure that the team had outgrown, into something the whole organization could actually build on.
The hard part wasn't the migration itself. It was doing it without breaking the 100-plus Power BI reports that operations teams checked every day. Every pipeline had to be rebuilt. Every report had to be validated. Every data type had to behave the same on the other side. It did.
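The shape of that validation pass is worth showing. Here's a sketch with hypothetical table names and connection details -- the real work also compared data types, aggregates, and report outputs side by side:

```python
# Per-table row-count check between the on-prem source and Snowflake.
# Table list and connection details here are placeholders.
import pyodbc
import snowflake.connector

TABLES = ["INTERVAL_READS", "METER_STATUS", "COMM_EVENTS"]  # illustrative

src = pyodbc.connect("DSN=legacy_dw")  # aging on-prem warehouse
dst = snowflake.connector.connect()    # default connection from config

for table in TABLES:
    q = f"SELECT COUNT(*) FROM {table}"
    src_count = src.cursor().execute(q).fetchone()[0]
    dst_count = dst.cursor().execute(q).fetchone()[0]
    status = "OK" if src_count == dst_count else "MISMATCH"
    print(f"{table}: source={src_count:,} snowflake={dst_count:,} {status}")
```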
Built to predict meter overheating events 7 to 10 days out. Inputs: meter population, cooling degree days, a 20-year weather baseline built using weather year methodology, and prior high-temp events. Added Monte Carlo simulation for confidence ranges, then wired in a live weather.gov API so predictions update automatically as the forecast changes. Operators use this every summer to decide where to stage crews.
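The weather.gov side is a public, keyless API, so the fetch itself is short. A sketch using Louisville coordinates as an example point:

```python
# Pull the 7-day outlook from api.weather.gov (free, no API key; it does
# want a User-Agent). The lat/lon here is just an example point.
import requests

HEADERS = {"User-Agent": "meter-forecast-sketch (you@example.com)"}

# Step 1: resolve a lat/lon to its gridpoint forecast endpoint.
point = requests.get(
    "https://api.weather.gov/points/38.25,-85.76",
    headers=HEADERS, timeout=10,
).json()
forecast_url = point["properties"]["forecast"]

# Step 2: fetch the forecast periods and keep the daytime highs.
periods = requests.get(
    forecast_url, headers=HEADERS, timeout=10
).json()["properties"]["periods"]
highs = [(p["name"], p["temperature"]) for p in periods if p["isDaytime"]]
print(highs)  # these highs feed the model's temperature inputs
```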
A drill-down dashboard that starts at the full meter network and goes all the way down to a single device. System communication health, individual meter status, ESRI map with geolocation of that meter and its neighbors, click-out to Google Maps, then into paginated interval reports showing every reading -- gaps, overheating events, threshold proximity, full history. One tool, one continuous path. Seven-plus source systems feeding it.
Built Python pipelines using LLMs and Snowflake Cortex AI so operational users can ask questions in plain English and get back SQL-generated charts and tables -- no query knowledge required. Semantic models define the business logic underneath so the language layer knows what the data actually means. Users who've never written a line of SQL are pulling operational reports on their own.
When COVID hit in 2020, plant managers had a real problem: no tool existed to help them think through the cost of different contractor sequestration scenarios. I built one in Excel VBA. Input the plant configuration, choose the scenario, get the projected cost outcomes. Distributed to managers at five-plus plants. They made real staffing decisions with it during a period when getting those calls wrong was expensive.
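The original tool was Excel VBA; the core calculation translates to a few lines of Python. The rates below are invented -- the real model pulled contractor- and plant-specific inputs:

```python
# Same idea as the VBA tool, sketched in Python with invented rates.
def sequestration_cost(contractors: int, days: int,
                       daily_rate: float = 850.0,
                       lodging_per_day: float = 140.0,
                       meals_per_day: float = 60.0) -> float:
    """Projected cost of keeping a contractor crew sequestered on site."""
    per_person_day = daily_rate + lodging_per_day + meals_per_day
    return contractors * days * per_person_day

# Compare staffing scenarios for one plant configuration.
for crew in (20, 35, 50):
    print(f"{crew} contractors, 30 days: ${sequestration_cost(crew, 30):,.0f}")
```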
Built a stack of personal AI agents using the tools I had: Twilio for SMS and voice channels, VAPI for voice AI, Make.com to wire together automated workflows, and Formspree for form-based triggers. These aren't demos. They handle real tasks -- routing, notification, response, follow-up -- without me touching them. This is what AI integration looks like when you stop reading about it and start building.
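As one example of the pattern, the SMS-routing piece boils down to a small webhook. A sketch using Twilio's Python helper library -- the real agents chain this into Make.com scenarios and VAPI voice flows:

```python
# Minimal inbound-SMS webhook; Twilio posts each message here. The
# routing logic is a stub -- the real agents hand off to other workflows.
from flask import Flask, request
from twilio.twiml.messaging_response import MessagingResponse

app = Flask(__name__)

@app.route("/sms", methods=["POST"])
def inbound_sms():
    body = request.form.get("Body", "").strip().lower()
    reply = MessagingResponse()
    if "status" in body:
        reply.message("All pipelines green as of the last check.")
    else:
        reply.message("Got it -- logged for follow-up.")
    return str(reply)  # TwiML reply Twilio sends back to the sender

if __name__ == "__main__":
    app.run(port=5000)
```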
A weekly newsletter on LinkedIn for data professionals who want the practical side of AI and analytics -- not hype, not fluff, actual things that work in real environments. Every issue is written from the perspective of someone who's actually using these tools in production, not summarizing someone else's blog post.
A $47 digital product for data professionals who want a practical framework for integrating AI into their analytics workflow. Not a beginner's guide to ChatGPT. A practitioner's playbook built from real experience doing this in a production environment -- what to prompt, how to structure it, where AI actually helps and where it doesn't.
The same frameworks I use in production at a major utility, packaged for data professionals who want to actually use AI in their work -- not just read about it.
This isn't a beginner's guide to ChatGPT. It's a practitioner's playbook built from actual experience integrating LLMs into production data workflows -- what to prompt, how to structure it, where AI genuinely helps and where it gets in the way. Written by someone who runs 700 million rows a day through a Snowflake environment and uses Claude and ChatGPT as daily production tools.
Who it's for: Data analysts and scientists who are curious about AI integration but haven't found a framework that connects to real analytics work -- not toy examples, not abstract theory.
Next product
The Snowflake migration at LG&E took 700 million rows a day from aging on-prem infrastructure to a cloud architecture the whole organization can build on. That process is repeatable -- and worth documenting properly.
A practical guide is coming. The kind that covers what the docs don't: what breaks, what order to do things in, and how to migrate 100-plus reports without blowing up a team's morning routine.
I'm open to conversations about roles, projects, or anything in the data and AI space that's worth talking about. I don't do cold calls or generic networking. But if you've got something specific on your mind, I'll read it and respond.
The fastest way to reach me is LinkedIn. The form works too -- I check it regularly. If you're reaching out about the AI Playbook or the newsletter, those links are in the Products and Newsletter sections above.
I'll respond within a day or two. No auto-replies, no sales sequence -- just me reading it.
No newsletters, no spam. This goes straight to my inbox.