Design and implement scalable, secure, and efficient data pipelines using GCP tools such as BigQuery, Pub/Sub, and Dataflow.
Collaborate with cross-functional teams to gather requirements and develop solutions that meet business needs.
Develop Python scripts to automate tasks, monitor system performance, and troubleshoot issues.
Ensure compliance with security standards by implementing access controls, encryption, and logging mechanisms.
Job Requirements:
10-12 years of experience in data engineering or a related field.
Strong understanding of GCP services, including BigQuery, Pub/Sub, and Dataflow.
Proficiency in the Python programming language for scripting tasks.
Job Classification
Industry: IT Services & Consulting
Functional Area / Department: Data Science & Analytics
Role Category: Data Science & Analytics - Other
Role: Data Science & Analytics - Other
Employment Type: Full time