Note: Only IMMEDIATE JOINERS should apply.
Mandatory: the 2nd technical round is a face-to-face round (Hyderabad/Chennai).
Job Description:
We are looking for an experienced GCP Data Engineer to design, develop, and optimize data pipelines and solutions on Google Cloud Platform (GCP). The ideal candidate should have hands-on experience with BigQuery, Dataflow, PySpark, GCS, and Airflow (Cloud Composer), along with strong expertise or knowledge in DBT.

Key Responsibilities:
- Design and develop scalable ETL/ELT data pipelines using Dataflow (Apache Beam), PySpark, and Airflow (Cloud Composer).
- Work extensively with BigQuery for data transformation, storage, and analytics.
- Implement data ingestion, processing, and transformation workflows using GCP-native services.
- Optimize and troubleshoot performance issues in BigQuery and Dataflow pipelines.
- Manage data storage and governance using Google Cloud Storage (GCS) and other GCP services.
- Ensure data quality, security, and compliance with industry standards.
- Work closely with data scientists, analysts, and business teams to provide data solutions.
- Automate workflows, monitor jobs, and improve pipeline efficiency.
Required Skills:
- Google Cloud Platform (GCP) data engineering (GCP DE certification preferred)
- DBT knowledge or experience (mandatory)
- BigQuery: data modeling, query optimization, and performance tuning
- PySpark: data processing and transformation
- GCS (Google Cloud Storage): data storage and management
- Airflow / Cloud Composer: workflow orchestration and scheduling
- SQL & Python
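As a rough illustration of the BigQuery transformation and SQL/Python skills listed above, here is a minimal, stdlib-only Python sketch that renders an incremental-upsert MERGE statement of the kind a dbt incremental model or an orchestrated load job might produce. All table and column names below are hypothetical placeholders, not part of the role description.

```python
def build_incremental_merge(target: str, source: str, key: str, columns: list[str]) -> str:
    """Render a BigQuery-style MERGE (upsert) statement for an incremental load.

    `target`, `source`, `key`, and `columns` are illustrative placeholders.
    """
    # Update every non-key column from the staging (source) table.
    set_clause = ", ".join(f"T.{c} = S.{c}" for c in columns)
    # Insert the key plus all other columns for rows not yet in the target.
    col_list = ", ".join([key, *columns])
    src_list = ", ".join(f"S.{c}" for c in [key, *columns])
    return (
        f"MERGE `{target}` AS T\n"
        f"USING `{source}` AS S\n"
        f"ON T.{key} = S.{key}\n"
        f"WHEN MATCHED THEN UPDATE SET {set_clause}\n"
        f"WHEN NOT MATCHED THEN INSERT ({col_list}) VALUES ({src_list})"
    )

# Hypothetical dataset/table names for demonstration only.
sql = build_incremental_merge(
    "analytics.orders", "staging.orders_delta", "order_id", ["status", "amount"]
)
print(sql)
```

In practice the rendered statement would be submitted through the BigQuery client or a dbt model rather than printed; the sketch only shows the shape of the SQL involved.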
Preferred candidate profile:
We need a Senior Data Engineer with data modelling and Terraform experience who can lead a project end to end: requirement gathering, design, client management, and delivery.

Voice Process Requirements

Languages:
- English & Tamil
- English & Hindi

Eligibility Criteria:
- Good communication skills
- Graduates only (no active backlogs)
- Local candidates only (no far-location profiles)

Salary: Up to 20,000/month (depending on performance)