Employment Type: Full-time Long-term Contract (Annual Renewal)

Summary
We are seeking a highly skilled and motivated Lead GCP Data Engineer to join our team.
This role is critical to the development and operation of cloud-native, AI-driven enterprise data products that power global media planning and analytics.
As the Lead GCP Data Engineer, you will architect, build, and maintain scalable, secure, and optimized data solutions on Google Cloud Platform (GCP).
Your focus will be on developing robust ELT pipelines, streaming workloads, API-based ingestion frameworks, and orchestration using tools such as Apache Spark, Airflow (Cloud Composer), and BigQuery.
You'll operate in a fast-paced environment, supporting data-driven innovation across cross-functional teams and ensuring reliability, compliance, and cost efficiency in all workflows.
Key Responsibilities

Data Engineering & Development
Design, build, and optimize scalable ELT/ETL pipelines to process structured and unstructured data across batch and streaming systems.
Architect and deploy cloud-native data workflows using GCP services including BigQuery, Cloud Storage, Cloud Functions, Cloud Pub/Sub, Dataflow, and Cloud Composer.
Build high-throughput Apache Spark workloads in Python and SQL, with performance tuning for scale and cost.
Develop parameterized DAGs in Apache Airflow with retry logic, alerting, SLA/SLO enforcement, and robust monitoring (see the sketch after this list).
Build reusable frameworks for high-volume API ingestion, transforming Postman collections into production-ready Python modules.
Translate business and product requirements into scalable, efficient data systems that are reliable and secure.
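By way of illustration, here is a minimal sketch of the parameterized, retry-aware DAG pattern described above, assuming Airflow 2.4+; the DAG id, endpoint names, and callback are hypothetical placeholders, not part of any existing codebase.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def notify_on_failure(context):
    # Alerting hook: in practice this would page or post to a chat channel.
    print(f"Task {context['task_instance'].task_id} failed")


def extract_endpoint(endpoint, **_):
    # Placeholder for one parameterized API-ingestion step.
    print(f"Pulling {endpoint}")


default_args = {
    "retries": 3,                               # retry logic
    "retry_delay": timedelta(minutes=5),
    "sla": timedelta(hours=1),                  # per-task SLA enforcement
    "on_failure_callback": notify_on_failure,   # alerting
}

with DAG(
    dag_id="campaign_ingest",                   # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@hourly",
    catchup=False,
    default_args=default_args,
):
    # One task per endpoint, generated from a config list (parameterized DAG).
    for endpoint in ("impressions", "clicks", "spend"):
        PythonOperator(
            task_id=f"extract_{endpoint}",
            python_callable=extract_endpoint,
            op_kwargs={"endpoint": endpoint},
        )
```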
Cloud Infrastructure & Security
Deploy and manage GCP infrastructure for secure data operations, implementing IAM and VPC-based security controls.
Ensure robustness, scalability, and cost-efficiency of all infrastructure, following FinOps best practices.
Apply automation through CI/CD pipelines using tools like Git, Jenkins, or Bitbucket.
Data Quality, Governance & Optimization
Design and implement data quality frameworks, monitoring, validation, and anomaly detection (see the sketch after this list).
Build observability dashboards to ensure pipeline health and proactively address issues.
Ensure compliance with data governance policies, privacy regulations, and security standards.
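As an illustration of the kind of validation gate this involves, a minimal sketch assuming pandas; the column names and thresholds are purely hypothetical.

```python
import pandas as pd


def validate_batch(df: pd.DataFrame) -> list[str]:
    # Returns human-readable failures; an empty list means the batch passes.
    failures = []
    if df.empty:
        failures.append("batch is empty")
        return failures
    null_rate = df["campaign_id"].isna().mean()
    if null_rate > 0.01:  # tolerate at most 1% missing join keys
        failures.append(f"campaign_id null rate {null_rate:.1%} exceeds 1%")
    if (df["spend"] < 0).any():  # spend should never be negative
        failures.append("negative spend values found")
    return failures


batch = pd.DataFrame({"campaign_id": ["a", None], "spend": [10.0, -2.5]})
problems = validate_batch(batch)
if problems:
    raise ValueError("; ".join(problems))  # fail the pipeline run loudly
```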
Collaboration & Project Delivery
Work closely with cross-functional stakeholders including data scientists, analysts, DevOps, product managers, and business teams.
Effectively communicate technical solutions to non-technical stakeholders.
Manage multiple concurrent projects, shifting priorities quickly and delivering under tight timelines.
Collaborate within a globally distributed team, with real-time engagement through 2 p.m. U.S. Eastern Time.
Qualifications & Certifications

Education
Bachelor's or Master's degree in Computer Science, Information Technology, Engineering, or a related field.
Experience
7+ years in data engineering, including 5+ years of hands-on experience on GCP.
Proven track record with tools and services like BigQuery, Cloud Composer (Apache Airflow), Cloud Functions, Pub/Sub, Cloud Storage, Dataflow, and IAM/VPC.
Demonstrated expertise in Apache Spark (batch and streaming), PySpark, and building scalable API integrations.
Advanced Airflow skills including custom operators, dynamic DAGs, and workflow performance tuning.
Certifications
Google Cloud Professional Data Engineer certification preferred.
Key Skills

Mandatory Technical Skills
Advanced Python (PySpark, Pandas, pytest) for automation and data pipelines.
Strong SQL with experience in window functions, CTEs, partitioning, and optimization (see the sketch after this list).
Proficiency in GCP services including BigQuery, Dataflow, Cloud Composer, Cloud Functions, and Cloud Storage.
Hands-on with Apache Airflow, including dynamic DAGs, retries, and SLA enforcement.
Expertise in API data ingestion, Postman collections, and REST/GraphQL integration workflows.
Familiarity with CI/CD workflows using Git, Jenkins, or Bitbucket.
Experience with infrastructure security and governance using IAM and VPC.
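To illustrate the expected SQL depth, a minimal sketch combining a CTE, partition pruning, and a window function, run through the google-cloud-bigquery Python client; the project, dataset, and column names are hypothetical.

```python
from google.cloud import bigquery

client = bigquery.Client()  # uses ambient credentials and default project

query = """
WITH daily AS (                        -- CTE
  SELECT campaign_id,
         DATE(event_ts) AS day,
         SUM(spend)     AS spend
  FROM `my_project.analytics.events`   -- hypothetical partitioned table
  WHERE DATE(event_ts) >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)  -- prune partitions
  GROUP BY campaign_id, day
)
SELECT campaign_id,
       day,
       spend,
       AVG(spend) OVER (               -- window function: 7-day moving average
         PARTITION BY campaign_id
         ORDER BY day
         ROWS BETWEEN 6 PRECEDING AND CURRENT ROW
       ) AS spend_7d_avg
FROM daily
"""

for row in client.query(query).result():
    print(row.campaign_id, row.day, row.spend_7d_avg)
```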
Nice-to-Have Skills
Experience with Terraform or Kubernetes (GKE).
Familiarity with data visualization tools such as Looker or Tableau.
Exposure to MarTech/AdTech data sources and campaign analytics.
Knowledge of machine learning workflows and their integration with data pipelines.
Experience with other cloud platforms like AWS or Azure.
Soft Skills
Strong problem-solving and critical-thinking abilities.
Excellent verbal and written communication skills to engage technical and non-technical stakeholders.
Proactive and adaptable, with a continuous learning mindset.
Ability to work independently as well as within a collaborative, distributed team.
Working Hours
Must be available for real-time collaboration with U.S. stakeholders every business day through 2 p.m. U.S. Eastern Time (minimum 4-hour overlap).
Job Classification
Industry: IT Services & Consulting
Functional Area / Department: Engineering - Software & QA
Role Category: Software Development
Role: Data Engineer
Employment Type: Full-time