
Data Engineer @ Optum



Job Description


Data Engineer Consultant APC383 - 27 (Individual Contributor)

Position Overview:
OHBI is seeking a highly skilled and experienced Data Engineer to join our team. The ideal candidate will have a strong background in programming, data management, and cloud infrastructure, with a focus on designing and implementing efficient data solutions. This role requires a minimum of 3 to 6 years of experience with a deep understanding of Azure services and infrastructure, ETL/ELT solutions, Snowflake, and artificial intelligence technologies. The candidate should have some knowledge of AI tools and applications to enhance workflows, automate tasks, and extract insights from data.

Key Responsibilities:

  • AI Integration and Development: Design, develop, and implement artificial intelligence systems, including the use of AI to create new software and AI-driven solutions. Apply AI-powered tools for data analysis, task automation, and decision-making.
  • Azure Infrastructure Management: Own and maintain all aspects of Azure infrastructure, recommending modifications to enhance reliability, availability, and scalability.
  • Security Management: Manage security aspects of Azure infrastructure, including network, firewall, private endpoints, encryption, PIM, and permissions management using Azure RBAC and Databricks roles.
  • Technical Troubleshooting: Diagnose and troubleshoot technical issues in a timely manner, identifying root causes and providing effective solutions.
  • Infrastructure as Code: Create and maintain Azure Infrastructure as Code using Terraform and GitHub Actions.
  • CI/CD Pipelines: Configure and maintain CI/CD pipelines using GitHub Actions for various Azure services such as ADF, Databricks, Storage, and Key Vault.
  • Programming Expertise: Utilize your expertise in programming languages such as Python to develop and maintain data engineering solutions.
  • Real-Time Data Streaming: Use Kafka for real-time data streaming and integration, ensuring efficient data flow and processing.
  • Data Management: Use Snowflake for data wrangling and management, optimizing data structures for analysis.
  • DBT Utilization: Build and maintain data marts and views using DBT, ensuring data is structured for optimal analysis.
  • ETL/ELT Solutions: Design ETL/ELT solutions using tools like Azure Data Factory and Azure Databricks to acquire data from various structured and semi-structured source systems.
  • Communication: Explain technical solutions and issues clearly in both technical and non-technical terms, ensuring understanding among the Engineering Lead (Delivery Owner) and key stakeholders (business leadership).
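
To illustrate the kind of ETL/ELT transformation work these responsibilities describe — purely a hypothetical sketch, not part of the role definition — a minimal Python routine for flattening semi-structured (JSON-like) source records before loading into a warehouse such as Snowflake might look like:

```python
# Hypothetical sketch: flatten nested, semi-structured records into flat
# rows, the kind of step an ADF/Databricks pipeline might perform before
# loading into Snowflake. All names here are illustrative.

def flatten_record(record, parent_key="", sep="_"):
    """Recursively flatten nested dicts into a single-level dict."""
    flat = {}
    for key, value in record.items():
        new_key = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict):
            flat.update(flatten_record(value, new_key, sep))
        else:
            flat[new_key] = value
    return flat

if __name__ == "__main__":
    raw = {"id": 1, "member": {"name": "A. Patel", "plan": {"tier": "gold"}}}
    print(flatten_record(raw))
    # {'id': 1, 'member_name': 'A. Patel', 'member_plan_tier': 'gold'}
```

In practice this logic would run inside a Databricks notebook or an ADF-orchestrated job; the pure-Python version above just shows the transformation itself.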

Qualifications:

  • 3 to 6 years of experience designing ETL/ELT solutions using tools such as Azure Data Factory, Azure Databricks, Snowflake (including tasks and streams), Microsoft Data Fabric, Apache Iceberg, etc.
  • Knowledgeable in programming languages such as Python.
  • 1-2 years of experience with LLMs, machine learning, and data science.
  • 1-2 years of experience using AI tools for data analysis, automation, and insight extraction from large datasets.
  • Experience with Kafka for real-time data streaming and integration.
  • Proficiency in Snowflake for data wrangling and management.
  • Working understanding of dbt to build and maintain data marts and views.
  • In-depth understanding of managing security aspects of Azure infrastructure.
  • Experience in creating and maintaining Azure Infrastructure as Code using Terraform and GitHub Actions.
  • Ability to configure, set up, and maintain GitHub for various code repositories.
  • Experience in creating and configuring CI/CD pipelines using GitHub Actions for various Azure services.
  • Strong problem-solving skills and ability to diagnose and troubleshoot technical issues.
  • Excellent communication skills.
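
As a rough illustration of the real-time streaming requirement above — a toy, broker-free sketch only; a real implementation would consume events from Kafka — the per-key aggregation a micro-batch consumer often applies can be shown in plain Python:

```python
# Hypothetical sketch of the micro-batch aggregation a Kafka consumer
# might apply; events are simulated in-memory, so no broker is required.
from collections import defaultdict

def aggregate_by_key(events):
    """Sum the 'amount' field per 'key' across one batch of events."""
    totals = defaultdict(float)
    for event in events:
        totals[event["key"]] += event["amount"]
    return dict(totals)

if __name__ == "__main__":
    batch = [
        {"key": "claims", "amount": 10.0},
        {"key": "claims", "amount": 5.5},
        {"key": "rx", "amount": 2.0},
    ]
    print(aggregate_by_key(batch))
    # {'claims': 15.5, 'rx': 2.0}
```

In a Kafka deployment the batch would come from a consumer poll loop and the totals would be written downstream (e.g., to Snowflake), but the aggregation logic is the same.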

Job Classification

Industry: Software Product
Functional Area / Department: Engineering - Software & QA
Role Category: DBA / Data warehousing
Role: Database Developer / Engineer
Employment Type: Full time

Contact Details:

Company: Optum
Location(s): Hyderabad



Keyskills: Snowflake, ETL, Databricks, GitHub, AI/ML


₹ 18-27.5 Lacs P.A


Optum

With the playing field for software development radically altered by distributed computing applications that span from cloud to mobile to Internet of Things (IoT), efficiency is at a premium. OptumSoft streamlines the software writing process through a platform that replaces the ad hoc approach many...