We are seeking a skilled professional with strong expertise in both data warehouse modeling and modern data transformation using dbt (data build tool). The ideal candidate will design scalable data models and implement efficient ETL/ELT pipelines to support analytics and reporting needs.
Design and implement scalable data warehouse models (star and snowflake schemas)
Develop and maintain fact and dimension tables aligned with business requirements
Build and manage data transformation pipelines using dbt
Translate business requirements into well-structured, reusable data models
Implement data validation, testing, and documentation within dbt
Optimize SQL queries and transformations for performance and scalability
Collaborate with stakeholders, data engineers, and analysts to gather and refine data requirements
Ensure adherence to data quality, consistency, and governance standards
Integrate dbt workflows into CI/CD pipelines and version control systems
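To illustrate the modeling work described above, here is a minimal sketch (in plain Python, warehouse-agnostic, with hypothetical column names) of the core move in star-schema dimensional modeling: splitting denormalized source rows into a dimension table keyed by surrogate key and a fact table that references it.

```python
# Raw, denormalized rows as they might land from an operational system.
# All names here are illustrative assumptions, not a real schema.
raw_orders = [
    {"order_id": 1, "customer": "Acme", "region": "EU", "amount": 120.0},
    {"order_id": 2, "customer": "Acme", "region": "EU", "amount": 80.0},
    {"order_id": 3, "customer": "Globex", "region": "US", "amount": 200.0},
]

def build_star_schema(rows):
    """Derive a customer dimension (one row per customer) and an orders
    fact table whose rows reference the dimension by surrogate key."""
    key_map = {}    # natural key (customer name) -> surrogate key
    dim_rows = []   # dim_customer
    fact_rows = []  # fct_orders
    for row in rows:
        natural_key = row["customer"]
        if natural_key not in key_map:
            surrogate = len(dim_rows) + 1
            key_map[natural_key] = surrogate
            dim_rows.append({"customer_key": surrogate,
                             "customer_name": natural_key,
                             "region": row["region"]})
        fact_rows.append({"order_id": row["order_id"],
                          "customer_key": key_map[natural_key],
                          "amount": row["amount"]})
    return dim_rows, fact_rows

dim_rows, fact_rows = build_star_schema(raw_orders)
```

In a dbt project the same split would typically live in two SQL models (one per table), with the fact model joining back to the dimension via `ref()`; the Python above only sketches the shape of the output.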
Strong experience in data warehouse design and dimensional modeling
Hands-on expertise with dbt (data build tool)
Advanced SQL skills for large-scale data transformation
Experience working with cloud data warehouses (Snowflake, BigQuery, Redshift, or Azure Synapse)
Solid understanding of ETL/ELT pipelines and modern data architecture
Experience with Git and CI/CD practices
Strong problem-solving and analytical skills
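The data validation and testing work mentioned above can be sketched as follows: dbt expresses checks such as `unique` and `not_null` declaratively in a schema.yml file; the Python below reimplements those two checks directly, purely to illustrate what they assert (the sample rows are hypothetical).

```python
def check_not_null(rows, column):
    """Return the rows where `column` is missing or None
    (the failing rows a dbt not_null test would report)."""
    return [r for r in rows if r.get(column) is None]

def check_unique(rows, column):
    """Return the values of `column` that appear more than once
    (the failing values a dbt unique test would report)."""
    seen, dupes = set(), set()
    for r in rows:
        value = r.get(column)
        if value in seen:
            dupes.add(value)
        seen.add(value)
    return sorted(dupes)

# Sample dimension rows with one null name and one duplicated key.
customers = [
    {"customer_key": 1, "customer_name": "Acme"},
    {"customer_key": 2, "customer_name": None},
    {"customer_key": 2, "customer_name": "Globex"},
]

null_failures = check_not_null(customers, "customer_name")
dupe_failures = check_unique(customers, "customer_key")
```

Both checks pass only when they return empty results, mirroring how a dbt test passes when its query returns zero failing rows.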
Data Warehouse Design, Dimensional Modeling, dbt, Advanced SQL, ETL/ELT Pipelines, Data Transformation, Data Modeling, Fact and Dimension Tables, Star Schema, Snowflake Schema, Cloud Data Warehousing, Snowflake, BigQuery, Amazon Redshift, Azure Synapse, Git, CI/CD, Data Validation, Data Testing, Data Documentation, Query Optimization, Data Governance, Data Quality Management, Problem Solving, Analytical Skills