Mandatory skills (8+ years of experience in ETL development, with 4+ years of AWS PySpark scripting):
1. Experience deploying and running AWS-based data solutions using services or
products such as S3, Lambda, SNS, and Step Functions.
2. Strong proficiency in PySpark (a brief sketch of the kind of scripting involved follows this list).
3. Hands-on working knowledge of Python packages such as NumPy and Pandas.
4. Sound knowledge of AWS services is a must.
5. Ability to work as an individual contributor.
6. Familiarity with metadata management, data lineage, and data governance principles is a plus.
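
For illustration only, a minimal PySpark sketch of the kind of ETL scripting described above; the S3 paths, column names, and aggregation logic are hypothetical placeholders, not part of any actual project codebase.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("sample-etl").getOrCreate()

    # Read semi-structured JSON landed in S3 (hypothetical path).
    orders = spark.read.json("s3://example-landing-bucket/orders/")

    # Basic cleansing and daily aggregation before writing to a curated zone.
    daily_totals = (
        orders
        .filter(F.col("status") == "COMPLETED")
        .withColumn("order_date", F.to_date("order_timestamp"))
        .groupBy("order_date")
        .agg(
            F.sum("amount").alias("total_amount"),
            F.count("order_id").alias("order_count"),
        )
    )

    daily_totals.write.mode("overwrite").parquet("s3://example-curated-bucket/daily_totals/")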
Good to have:
1. Experience processing large-scale data transformations over both semi-structured and structured data.
2. Experience building data lakes and configuring Delta tables.
3. Good experience with compute and cost optimization.
4. Ability to understand the environment and use case and build holistic data integration frameworks.
5. Good experience with MWAA (Airflow orchestration); a minimal DAG sketch follows this list.
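
For illustration only, a minimal sketch of the Airflow orchestration referred to in item 5, assuming an MWAA environment with the Amazon provider package installed; the DAG id, schedule, and Glue job name are hypothetical placeholders.

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.amazon.aws.operators.glue import GlueJobOperator

    # Orchestrate a daily run of a (hypothetical) Glue ETL job on MWAA.
    with DAG(
        dag_id="daily_orders_etl",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        run_etl = GlueJobOperator(
            task_id="run_orders_etl",
            job_name="orders-etl-job",  # hypothetical Glue job name
        )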
Soft skills:
1. Good communication skills for interacting with IT stakeholders and business teams.
2. Ability to understand delivery pain points.

We are a software development services company with thought leadership in engineering digital solutions. We enable your enterprise to be more engaging, insightful, predictive, and efficient by adopting the technology advancements of the digital revolution and by supporting you from ideati...