
GCP Data Engineer with AI Development knowledge @ Tech Mahindra



Job Description

Location:

  • Any city in India

GCP Data Engineer (India)

Skillset

  • In your role as an Engineer, you will work with change initiatives across the organization, helping them design solutions that meet our architecture principles and drive the Bank towards its desired target state.
  • You will work closely with data modelers to implement data ingestion and transformation patterns for feeds coming from core banking platforms into the warehousing system.
  • You will also design job streams in Control-M or Apache Airflow according to requirements.
  • You will work closely with cloud engineers to design and develop the next generation of data distribution solutions, leveraging GCP capabilities.
  • Work with business and technology stakeholders at all levels to understand interfacing application requirements, set priorities and obtain the required business sign-offs.
  • Actively manage testing, including scoping, defining test strategies, driving defect management, running status calls, meeting business stakeholders' expectations and obtaining sign-offs.
  • Perform detailed technology analyses to highlight weaknesses and make recommendations for improvement.
  • Perform unit testing; support UAT; handle periodic production-release activities and paperwork, post-implementation checkouts, SDLC documentation, etc.
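The feed ingestion and orchestration duties above can be sketched as a minimal Airflow DAG. The DAG id, feed names, schedule and transform rule below are all hypothetical illustrations, not details from this posting; the guarded import lets the transform logic run even where Airflow is not installed.

```python
# Hypothetical sketch: core-banking feed pipeline as an Airflow 2.x DAG.
# Feed names, schedule and the transform rule are illustrative only.
from datetime import datetime

try:  # Airflow may not be available in every environment
    from airflow import DAG
    from airflow.operators.python import PythonOperator
    AIRFLOW_AVAILABLE = True
except ImportError:
    AIRFLOW_AVAILABLE = False

FEEDS = ["accounts", "transactions", "customers"]  # hypothetical feed list


def transform_feed(feed: str, records: list[dict]) -> list[dict]:
    """Apply a simple normalization pattern: tag each record with its
    source feed and drop rows missing a primary key."""
    return [
        {**r, "source_feed": feed}
        for r in records
        if r.get("id") is not None
    ]


if AIRFLOW_AVAILABLE:
    with DAG(
        dag_id="core_banking_feeds",   # hypothetical DAG id
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        # One transform task per feed; in practice each would read from a
        # landing zone and write to the warehouse staging area.
        for feed in FEEDS:
            PythonOperator(
                task_id=f"transform_{feed}",
                python_callable=transform_feed,
                op_args=[feed, []],
            )
```

The same one-task-per-feed pattern maps directly onto a Control-M job stream, with each feed as a separate job in the stream.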

Domain or platform knowledge / experience:

  • 5+ years of experience in the following areas:
  • Strong programming skills in Python and SQL/PL/SQL
  • Hands-on experience with PySpark for large-scale distributed data processing
  • Solid understanding of Apache Airflow (DAG design, scheduling, orchestration)
  • Experience working with Google Cloud Platform, especially:
      • BigQuery
      • Postgres
      • Cloud Storage
      • Dataproc
      • Cloud Composer
      • GKE
  • Knowledge of DevOps and configuration-management tools (TeamCity, Jenkins, uDeploy, Kubernetes, Maven, etc.)
  • Building scalable data processing solutions using PySpark on Dataproc/Dataflow
  • Stakeholder influencing and communication
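A PySpark batch job of the kind listed above might look like the sketch below; the bucket names, key column and schema are assumptions for illustration. The guarded import keeps the partition-path helper usable without Spark installed; on GCP such a script would typically be submitted to a Dataproc cluster rather than run locally.

```python
# Sketch of a PySpark batch job for Dataproc; paths and schema are hypothetical.
from datetime import date

try:  # pyspark is provided by the Dataproc runtime
    from pyspark.sql import SparkSession, functions as F
    PYSPARK_AVAILABLE = True
except ImportError:
    PYSPARK_AVAILABLE = False


def output_path(bucket: str, feed: str, run_date: date) -> str:
    """Build the date-partitioned GCS path the job writes to."""
    return f"gs://{bucket}/{feed}/dt={run_date.isoformat()}"


def main() -> None:
    spark = SparkSession.builder.appName("feed_transform").getOrCreate()
    # Hypothetical landing-zone input and curated output buckets.
    df = spark.read.parquet("gs://landing-bucket/transactions/")
    cleaned = (
        df.dropDuplicates(["txn_id"])          # hypothetical key column
          .filter(F.col("amount").isNotNull())
    )
    cleaned.write.mode("overwrite").parquet(
        output_path("curated-bucket", "transactions", date.today())
    )
    spark.stop()


if PYSPARK_AVAILABLE and __name__ == "__main__":
    main()
```

Writing to a deterministic `dt=` partition path keeps reruns idempotent, which matters when the job is retried from Airflow or Control-M.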

Additional Requirement:

  • GCP Data engineer with AI development knowledge

Good to have

  • Knowledge of Terraform or infrastructure as code
  • Understanding of GitHub Actions workflows
  • Experience with data quality frameworks
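In its simplest form, the data quality framework mentioned above is a set of rule-based checks run against a batch before it is loaded. The column names and rules below are illustrative, not from the posting:

```python
# Minimal rule-based data quality checks; columns and rules are illustrative.

def check_not_null(rows, column):
    """Fail if any row has a null in the given column."""
    bad = [i for i, r in enumerate(rows) if r.get(column) is None]
    return {"check": f"not_null:{column}", "passed": not bad, "failures": bad}


def check_unique(rows, column):
    """Fail if values in the given column are not unique."""
    seen, dupes = set(), []
    for i, r in enumerate(rows):
        v = r.get(column)
        if v in seen:
            dupes.append(i)
        seen.add(v)
    return {"check": f"unique:{column}", "passed": not dupes, "failures": dupes}


def run_checks(rows, checks):
    """Run every check; return whether the batch may be loaded, plus details."""
    results = [c(rows) for c in checks]
    return all(r["passed"] for r in results), results
```

In a real pipeline these checks would run as a gating task in the job stream, blocking the load step when any check fails.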

Job Classification

Industry: IT Services & Consulting
Functional Area / Department: Other
Role Category: Other
Role: Other
Employment Type: Full time

Contact Details:

Company: Tech Mahindra
Location(s): Hyderabad



Keyskills: Airflow, PySpark, GKE cluster, BigQuery, Dataproc, PL/SQL, Python


Salary: Not Disclosed


Tech Mahindra

Tech Mahindra Limited is an Indian multinational provider of information technology (IT), networking technology solutions and business process outsourcing (BPO) services to the telecommunications industry. Tech Mahindra is a US$4.2 billion company with over 117,000 employees across 90 countries. It provides...