About the Role
We are looking for an experienced Data Engineer with strong expertise in dbt (data build tool) and working knowledge of SAS to join a critical migration project. This role involves translating legacy SAS processes into a modern ELT architecture built on cloud data platforms, ensuring robust data quality and scalable solutions.
You will work closely with data teams to analyze existing SAS-based workflows, translate business logic into dbt SQL models, and implement rigorous testing strategies to ensure a successful migration.
Key Responsibilities
- Analyze and interpret existing SAS programs, macros, and procedures.
- Migrate SAS-based data transformations into dbt models following ELT best practices.
- Develop dbt tests to ensure data quality and validate migration accuracy.
- Implement efficient data pipelines aligned with modern data warehousing principles.
- Collaborate with data engineering teams to ensure smooth integration into cloud platforms.
- Manage version control and deployment using Git.
Required Skills & Experience
- 3+ years in data engineering, business intelligence, or analytics.
- 2+ years hands-on experience with dbt (Cloud or Core).
- Strong SQL skills and solid understanding of data modeling techniques (e.g., star schema).
- Ability to read, interpret, and write SAS code, including macros and PROC SQL.
- Experience with Git for version control and CI/CD practices.
- Good understanding of modern ELT pipelines and cloud-based data warehousing concepts.
Nice-to-Have
- Experience with cloud data platforms (Snowflake, BigQuery, Redshift, etc.).
- Familiarity with automated testing frameworks for data pipelines.
Languages
English – professional proficiency.