Job Description
We are looking for a strong Data Engineer with hands-on experience in building data pipelines, performing transformations, and working with cloud-based data tools. The ideal candidate must have solid SQL skills, a good understanding of data modeling, and practical experience with DBT and cloud platforms (preferably GCP).
Key Responsibilities:
Build and maintain scalable ETL/ELT pipelines.
Develop and enhance DBT models and transformations.
Perform data cleansing, validation, and quality checks.
Support cloud-based data engineering workloads (preferably GCP).
Write optimized SQL queries for analytics and data processing.
Collaborate with Data Engineers, Analysts, and Architects to understand requirements.
Monitor pipeline health, troubleshoot issues, and ensure data reliability.
Document workflows, models, and mappings.
Required Skills (Must Have):
Understanding of data modeling (Star/Snowflake schemas).
Strong SQL skills.
Hands-on experience with DBT.
Experience with at least one major cloud platform: GCP (BigQuery, Dataflow, Dataproc, Composer, GCS preferred) OR AWS / Azure equivalent services.
Experience with data transformation and loading technologies.
Good knowledge of Python for data tasks.
Please share your resume at Aa***********a@Co****e.Com

Key skills: Azure, DBT (Data Build Tool), Snowflake, Fivetran
Coforge is a global digital services and solutions provider that enables its clients to transform at the intersection of domain expertise and emerging technologies to achieve real-world business impact. A focus on very select industries, a detailed understanding of the underlying processes of those...