Lighthouse

Project Catalyst (Logistics)

A strategic consulting engagement in which we modernized the data infrastructure of a large logistics firm, replacing fragile SQL scripts with a robust data engineering pipeline.

The Challenge

The client had 20 years of data locked in a slow, on-premises Oracle database. Reports took 2 days to generate, making real-time decision-making impossible.

Our Solution

We implemented a "Modern Data Stack": Airflow to orchestrate data extraction, Snowflake as the warehouse for elastic storage and compute, and dbt to transform the raw data into clean business-logic models.
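The shape of that pipeline can be sketched in plain Python. This is a toy stand-in, not the engagement's actual DAG: the table contents are invented, and plain functions substitute for what Airflow, Snowflake, and dbt each handled in production.

```python
# Toy sketch of the three pipeline stages and their ordering.
# Task and column names are illustrative only.

def extract():
    """Pull rows from the source system (Oracle in the engagement)."""
    return [{"shipment_id": 1, "status": "delivered"},
            {"shipment_id": 2, "status": "in_transit"}]

def load(rows):
    """Land the raw rows in the warehouse untouched (Snowflake's role)."""
    return {"raw_shipments": rows}

def transform(warehouse):
    """Derive a clean business-logic model from the raw table (dbt's role)."""
    warehouse["delivered_shipments"] = [
        r for r in warehouse["raw_shipments"] if r["status"] == "delivered"
    ]
    return warehouse

# Airflow would schedule, retry, and monitor these tasks;
# here we simply run them in dependency order.
result = transform(load(extract()))
print(len(result["delivered_shipments"]))  # 1
```

The key design point is that each stage only depends on the output of the previous one, which is what lets an orchestrator retry or backfill stages independently.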

Tech Stack

  • Python
  • Airflow
  • Snowflake
  • dbt

The Team

  • Architect: Huy (Data Lead)
  • Orchestrator: Huy Dang (Account Manager)
  • Craftsperson: Data Engineer

Strategic Impact

Reduced reporting time from 2 days to 15 minutes. Empowered the client's analysts to build their own reports. Validated our "Data Competency".

Key Lessons Learned

"The "ELT" (Extract-Load-Transform) pattern is vastly superior to traditional ETL."

"Teaching clients to fish (dbt) is better than fishing for them."