
GCP Data Engineer @ Royal Cyber



Job Description

  • Employment Type: Full-time, Long-term Contract (Annual Renewal)

Summary

  • We are seeking a highly skilled and motivated Lead GCP Data Engineer to join our team.
  • This role is critical to the development and operation of cloud-native, AI-driven enterprise data products that power global media planning and analytics.
  • As a Senior Data Engineer, you will architect, build, and maintain scalable, secure, and optimized data solutions on Google Cloud Platform (GCP).
  • Your focus will be on developing robust ELT pipelines, streaming workloads, API-based ingestion frameworks, and orchestration using tools such as Apache Spark, Airflow (Cloud Composer), and BigQuery.
  • You'll operate in a fast-paced environment, supporting data-driven innovation across cross-functional teams and ensuring reliability, compliance, and cost efficiency in all workflows.
Key Responsibilities

Data Engineering & Development

  • Design, build, and optimize scalable ELT/ETL pipelines to process structured and unstructured data across batch and streaming systems.
  • Architect and deploy cloud-native data workflows using GCP services including BigQuery, Cloud Storage, Cloud Functions, Cloud Pub/Sub, Dataflow, and Cloud Composer.
  • Build high-throughput Apache Spark workloads in Python and SQL, with performance tuning for scale and cost.
  • Develop parameterized DAGs in Apache Airflow with retry logic, alerting, SLA/SLO enforcement, and robust monitoring.
  • Build reusable frameworks for high-volume API ingestion, transforming Postman collections into production-ready Python modules.
  • Translate business and product requirements into scalable, efficient data systems that are reliable and secure.

Cloud Infrastructure & Security

  • Implement IAM and VPC-based security to manage and deploy GCP infrastructure for secure data operations.
  • Ensure robustness, scalability, and cost-efficiency of all infrastructure, following FinOps best practices.
  • Apply automation through CI/CD pipelines using tools like Git, Jenkins, or Bitbucket.

Data Quality, Governance & Optimization

  • Design and implement data quality frameworks, monitoring, validation, and anomaly detection.
  • Build observability dashboards to ensure pipeline health and proactively address issues.
  • Ensure compliance with data governance policies, privacy regulations, and security standards.

Collaboration & Project Delivery

  • Work closely with cross-functional stakeholders including data scientists, analysts, DevOps, product managers, and business teams.
  • Effectively communicate technical solutions to non-technical stakeholders.
  • Manage multiple concurrent projects, shifting priorities quickly and delivering under tight timelines.
  • Collaborate within a globally distributed team with real-time engagement through 2 p.m. U.S. Eastern Time.
Qualifications & Certifications

Education

  • Bachelor's or Master's degree in Computer Science, Information Technology, Engineering, or a related field.

Experience

  • Minimum 7+ years in data engineering, with 5+ years of hands-on experience on GCP.
  • Proven track record with tools and services like BigQuery, Cloud Composer (Apache Airflow), Cloud Functions, Pub/Sub, Cloud Storage, Dataflow, and IAM/VPC.
  • Demonstrated expertise in Apache Spark (batch and streaming), PySpark, and building scalable API integrations.
  • Advanced Airflow skills including custom operators, dynamic DAGs, and workflow performance tuning.

Certifications

  • Google Cloud Professional Data Engineer certification preferred.
Key Skills

Mandatory Technical Skills

  • Advanced Python (PySpark, Pandas, pytest) for automation and data pipelines.
  • Strong SQL, with experience in window functions, CTEs, partitioning, and optimization.
  • Proficiency in GCP services including BigQuery, Dataflow, Cloud Composer, Cloud Functions, and Cloud Storage.
  • Hands-on with Apache Airflow, including dynamic DAGs, retries, and SLA enforcement.
  • Expertise in API data ingestion, Postman collections, and REST/GraphQL integration workflows.
  • Familiarity with CI/CD workflows using Git, Jenkins, or Bitbucket.
  • Experience with infrastructure security and governance using IAM and VPC.

Nice-to-Have Skills

  • Experience with Terraform or Kubernetes (GKE).
  • Familiarity with data visualization tools such as Looker or Tableau.
  • Exposure to MarTech/AdTech data sources and campaign analytics.
  • Knowledge of machine learning workflows and their integration with data pipelines.
  • Experience with other cloud platforms like AWS or Azure.

Soft Skills

  • Strong problem-solving and critical-thinking abilities.
  • Excellent verbal and written communication skills to engage technical and non-technical stakeholders.
  • Proactive and adaptable, with a continuous learning mindset.
  • Ability to work independently as well as within a collaborative, distributed team.
Working Hours

  • Must be available for real-time collaboration with U.S. stakeholders every business day through 2 p.m. U.S. Eastern Time (minimum 4-hour overlap).
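The responsibilities above call for reusable frameworks for high-volume API ingestion with retry logic. As a rough illustration only (not from the posting), a generic retry wrapper with exponential backoff might look like the sketch below; `fetch_with_retries` and `flaky_endpoint` are hypothetical names, and the real framework would layer authentication, pagination, and logging on top.

```python
import time


def fetch_with_retries(fetch, max_retries=3, backoff_seconds=0.1):
    """Call `fetch()` with exponential backoff, re-raising after max_retries."""
    for attempt in range(max_retries + 1):
        try:
            return fetch()
        except Exception:
            if attempt == max_retries:
                raise
            # Double the wait after each failed attempt.
            time.sleep(backoff_seconds * (2 ** attempt))


# Usage sketch: a fake endpoint that fails twice before succeeding,
# standing in for a flaky HTTP API.
calls = {"n": 0}

def flaky_endpoint():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient failure")
    return {"status": "ok"}

result = fetch_with_retries(flaky_endpoint)
```

Keeping the retry policy separate from the transport layer is what makes such a wrapper reusable across many API sources.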
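The posting also asks for parameterized Airflow DAGs with retry logic and SLA enforcement. A minimal sketch of such a DAG file is shown below, assuming Airflow 2.x; the DAG id, source names, and thresholds are invented for illustration and would differ in a real deployment.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator

# Retry and SLA policy shared by every task in the DAG.
default_args = {
    "retries": 3,
    "retry_delay": timedelta(minutes=5),
    "retry_exponential_backoff": True,
    "sla": timedelta(hours=1),
    "email_on_failure": True,
}

with DAG(
    dag_id="media_ingest_example",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    default_args=default_args,
    catchup=False,
) as dag:
    # Parameterized task generation: one ingest task per source.
    for source in ["campaigns", "impressions"]:
        PythonOperator(
            task_id=f"ingest_{source}",
            python_callable=lambda s=source: print(f"ingesting {s}"),
        )
```

Driving the task loop from a configuration list is one common way to make a DAG "parameterized" without duplicating task definitions.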
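Among the mandatory skills are SQL window functions and CTEs. As a small self-contained demonstration (using Python's built-in sqlite3 rather than BigQuery; window functions require SQLite 3.25+, which recent CPython bundles), the query below combines a CTE with a running total per campaign. The table and column names are invented for the example.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE spend (campaign TEXT, day TEXT, cost REAL);
INSERT INTO spend VALUES
  ('A', '2024-01-01', 100), ('A', '2024-01-02', 150),
  ('B', '2024-01-01', 200), ('B', '2024-01-02', 50);
""")

# The CTE names an intermediate result; the window function computes
# a running total of cost within each campaign, ordered by day.
query = """
WITH daily AS (
    SELECT campaign, day, cost FROM spend
)
SELECT campaign, day,
       SUM(cost) OVER (
           PARTITION BY campaign ORDER BY day
       ) AS running_cost
FROM daily
ORDER BY campaign, day;
"""
rows = conn.execute(query).fetchall()
for row in rows:
    print(row)
```

The same PARTITION BY / ORDER BY pattern carries over directly to BigQuery SQL, where partitioning the window by a key is also a common optimization lever.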

Job Classification

Industry: IT Services & Consulting
Functional Area / Department: Engineering - Software & QA
Role Category: Software Development
Role: Data Engineer
Employment Type: Full time

Contact Details:

Company: Royal Cyber
Location(s): Chennai



Keyskills: Computer science, Performance tuning, Automation, Machine learning, Workflow, Data quality, Information technology, Monitoring, SQL, Python


Salary: Not Disclosed

