
GCP Architect @ Datametica


GCP Architect

Job Description

We are seeking an experienced GCP Data Architect to design, build, and govern scalable, secure, and high-performance data platforms on Google Cloud Platform (GCP). The role requires deep expertise in cloud-native data architecture, ETL/ELT pipelines, analytics, and data warehousing, along with the ability to translate business requirements into robust technical solutions.

Key Responsibilities

  • Design end-to-end data architecture solutions on Google Cloud Platform
  • Define and implement scalable, secure, and cost-efficient data platforms
  • Architect and build ETL/ELT pipelines using Python and GCP-native services
  • Design and optimize data warehousing and analytics solutions using BigQuery
  • Develop and orchestrate data workflows using Cloud Composer (Airflow); a minimal pipeline sketch follows this list
  • Implement large-scale data processing using Dataflow and Dataproc
  • Design event-driven and streaming architectures using Pub/Sub and Cloud Functions
  • Manage and optimize data storage using Cloud Storage
  • Implement Data Loss Prevention (DLP) for data security, privacy, and compliance
  • Design and manage SQL-based data models and schemas
  • Integrate and manage relational databases including MS SQL Server and PostgreSQL
  • Ensure best practices around data governance, security, performance, and reliability
  • Collaborate with stakeholders, engineering teams, and business users to align data architecture with business goals
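
As a minimal sketch of the kind of Composer pipeline described above, assuming a hypothetical landing bucket, dataset, and table, a daily GCS-to-BigQuery load might look like this:

    # Minimal Cloud Composer (Airflow) DAG: load a daily CSV drop from
    # Cloud Storage into BigQuery. Bucket, dataset, and table names are
    # hypothetical placeholders.
    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
        GCSToBigQueryOperator,
    )

    with DAG(
        dag_id="daily_sales_load",       # hypothetical pipeline name
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        GCSToBigQueryOperator(
            task_id="gcs_to_bigquery",
            bucket="example-landing-bucket",          # hypothetical bucket
            source_objects=["sales/{{ ds }}/*.csv"],  # keyed by run date
            destination_project_dataset_table="analytics.sales_daily",
            source_format="CSV",
            skip_leading_rows=1,
            write_disposition="WRITE_TRUNCATE",       # idempotent daily reload
            autodetect=True,
        )

WRITE_TRUNCATE keeps the daily reload idempotent: re-running a failed date replaces that day's data rather than duplicating it.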

Required Skills & Experience

  • Strong hands-on experience with Google Cloud Platform (GCP)
  • Proven experience as a Data Architect or Senior Data Engineer
  • Expertise in BigQuery and modern data warehousing concepts
  • Strong proficiency in SQL and Python
  • Experience building and maintaining cloud-based ETL/ELT pipelines
  • Hands-on experience with:
    • Cloud Storage
    • Pub/Sub (see the event-handler sketch after this list)
    • Cloud Functions
    • Cloud Composer
    • Dataflow
    • Dataproc
  • Strong experience with relational databases: MS SQL Server, PostgreSQL
  • Solid understanding of data modeling, schema design, and performance optimization
  • Knowledge of data security, compliance, and DLP implementation
  • Experience with large-scale, distributed data systems
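
As a minimal sketch of the event-driven pattern these skills support, assuming a 2nd-gen Cloud Function subscribed to a Pub/Sub topic and a hypothetical destination table, a handler that streams each message into BigQuery might look like this:

    # Minimal Pub/Sub-triggered Cloud Function: decode the message and
    # stream it into BigQuery. The table name is a hypothetical placeholder.
    import base64
    import json

    import functions_framework
    from google.cloud import bigquery

    bq = bigquery.Client()
    TABLE = "analytics.events_raw"  # hypothetical destination table

    @functions_framework.cloud_event
    def handle_event(cloud_event):
        # Pub/Sub delivers the payload base64-encoded under message.data.
        payload = base64.b64decode(cloud_event.data["message"]["data"])
        row = json.loads(payload)
        errors = bq.insert_rows_json(TABLE, [row])  # streaming insert
        if errors:
            raise RuntimeError(f"BigQuery insert failed: {errors}")

At higher volumes the same topic would typically feed a Dataflow job instead; the function-per-message pattern suits lightweight enrichment and routing.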

Education

  • Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field
  • Equivalent practical experience will be considered

Nice to Have

  • GCP Professional Data Engineer or Architect certification
  • Experience with streaming analytics and real-time data processing
  • Exposure to CI/CD, Infrastructure as Code, or DevOps practices on GCP

Job Classification

Industry: Software Product
Functional Area / Department: Engineering - Software & QA
Role Category: Software Development
Role: Technical Architect
Employment Type: Full time

Contact Details:

Company: Datametica
Location(s): Pune



Keyskills: GCP, Cloud Migration, Presales, Data Warehousing, Pub/Sub, Bigtable, Architecture, BigQuery, Data Lake, ETL


Salary: ₹ Not Disclosed

Similar positions

Senior Principal Software Architect - AI Applications

  • Opentext
  • 15 - 20 years
  • Hyderabad
  • 11 days ago
₹ Not Disclosed

Application Architect-Azure Cloud Migration

  • IBM
  • 3 - 8 years
  • Pune
  • 11 days ago
₹ Not Disclosed

Cloud Solutions Architect

  • Infosys
  • 10 - 14 years
  • Pune
  • 11 days ago
₹ Not Disclosed

Java Cloud Architect

  • Hexaware Technologies
  • 7 - 12 years
  • Mumbai
  • 11 days ago
₹ Not Disclosed

Datametica

DataMetica is the leader in Big Data architecture, Advanced Analytics and Big Data Operations focused on serving large global companies. We provide fast and reliable integration of Hadoop and related technologies into enterprise operations. Our team comprises highly experienced Hadoop, NoSQL...