
Pyspark Data Engineer @ Atyeti


Job Description

Technical Skills

  • Must Have Skills:
    • Proficient in Python, PySpark, and Airflow
    • Strong understanding of Object-Oriented Programming and Functional Programming paradigms
    • Hands-on experience with Spark and its architecture
    • Knowledge of Software Engineering best practices
    • Advanced SQL knowledge (preferably Oracle)
    • Experience processing large amounts of structured and unstructured data, including integrating data from multiple sources
  • Good to Have Skills:
    • Knowledge of data-related AWS services
    • Knowledge of GitHub and Jenkins
    • Automated testing experience
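The last must-have point, integrating data from multiple sources with SQL, can be sketched in plain Python using the standard-library sqlite3 module as a lightweight stand-in for the Spark/Oracle stack described above. All table names and sample rows below are illustrative, not taken from the posting.

```python
import sqlite3

def integrate_sources():
    """Load two illustrative source tables into one store and join them with SQL."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (order_id INTEGER, customer_id INTEGER, amount REAL)")
    conn.execute("CREATE TABLE customers (customer_id INTEGER, name TEXT)")
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                     [(1, 10, 250.0), (2, 10, 100.0), (3, 20, 75.0)])
    conn.executemany("INSERT INTO customers VALUES (?, ?)",
                     [(10, "Asha"), (20, "Ravi")])
    # Aggregate per customer across the two joined sources.
    rows = conn.execute(
        """SELECT c.name, SUM(o.amount) AS total
           FROM orders o
           JOIN customers c ON o.customer_id = c.customer_id
           GROUP BY c.name
           ORDER BY total DESC"""
    ).fetchall()
    conn.close()
    return rows
```

In a PySpark pipeline the same join would typically be expressed on DataFrames, e.g. `df_orders.join(df_customers, "customer_id")` followed by a `groupBy`/`agg`, with Airflow scheduling the job.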

Job Classification

Industry: IT Services & Consulting
Functional Area / Department: Engineering - Software & QA
Role Category: Software Development
Role: Data Engineer
Employment Type: Full time

Contact Details:

Company: Atyeti
Location(s): Bengaluru



Keyskills: Airflow, Pyspark, Spark, Python, Data Engineering, SQL


Salary: ₹ Not Disclosed

