Azure Data Engineer @ Accion Labs


 Azure Data Engineer

Job Description

Role Snapshot

  • Title: Senior Microsoft Fabric Data Engineer
  • Experience: 8+ years in Data Engineering (minimum 4+ years on Azure, and/or 6 months to 1+ year with Microsoft Fabric)
  • Tech Focus: Microsoft Fabric, Azure Data Factory (ADF), Databricks (Python, PySpark, Spark SQL), Delta Lake, Power BI (DAX), Azure Storage, Lakehouse, Warehouse
  • Engagement: Client-facing, hands-on, design-to-delivery

Must-Have Skills (Strong, Hands-On)

  • Microsoft Fabric (2024+)
    OneLake, Lakehouse, Warehouse, Pipelines, Dataflows Gen2, Notebooks, capacities, workspace & item security, RLS/OLS.
  • Azure Data Factory (ADF)
    Reusable, parameterized pipelines; high-level orchestration; robust scheduling, logging, retries, and alerts.
  • Databricks (5+ years on Azure)
    • Python, PySpark, Spark SQL: complex transformations, joins, window functions, UDFs/UDAFs.
    • Complex & nested notebooks; modular code with %run / dbutils.notebook.run.
    • Structured Streaming: watermarks, triggers, checkpointing, foreachBatch, schema evolution (see the streaming sketch after this list).
    • Delta Lake: Z-ORDER, OPTIMIZE/VACUUM, MERGE for SCD, Auto Optimize, compaction, time travel (see the SCD MERGE sketch after this list).
    • Performance tuning: partitioning, file sizing, broadcast hints, caching, Photon (where available), cluster sizing/pools.
  • Medallion Architecture
    Bronze/Silver/Gold patterns, SCD (Type 1/2), handling late-arriving dimensions.
  • Azure Storage
    ADLS Gen2 (hierarchical namespace), tiering (Hot/Cool/Archive), lifecycle & cost optimization, shortcuts into OneLake.
  • Data Warehousing
    Dimensional modeling, fact/aggregate design, query performance tuning in Fabric Warehouse & Lakehouse SQL endpoint.
  • SQL
    Excellent SQL development; advanced joins, windowing, CTEs, performance tuning/indexing where applicable.
  • Power BI (DAX)
    Awareness of Power BI and DAX; RLS alignment with Warehouse/Lakehouse.
  • Security & Compliance
    RBAC, item-level permissions, credentials for data sources, RLS/OLS, secret management (Key Vault), PII handling.
  • ETL/ELT Methodologies
    Robust, testable pipelines; idempotency; error handling; data quality gates.
  • Ways of Working
    Agile delivery, client-facing communication, crisp demos, documentation, and best-practice advocacy.
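
For concreteness, here is a rough, minimal sketch of the Structured Streaming items above (Auto Loader ingestion, a watermark, checkpointing, foreachBatch, schema evolution). It assumes a Databricks notebook where `spark` is predefined; the storage paths, the event_id/event_time columns, and the bronze.events table are hypothetical placeholders, not part of the role description.

    from pyspark.sql import functions as F

    # Hypothetical paths and table names for illustration only; assumes a
    # Databricks notebook where `spark` is predefined and a bronze.events
    # Delta table with a compatible schema already exists.
    SOURCE_PATH = "abfss://raw@<storage-account>.dfs.core.windows.net/events/"
    CHECKPOINT = "abfss://bronze@<storage-account>.dfs.core.windows.net/_checkpoints/events"

    def upsert_batch(batch_df, batch_id):
        # Idempotent micro-batch write: dedupe, then MERGE so replays are safe.
        deduped = batch_df.dropDuplicates(["event_id"])
        deduped.createOrReplaceTempView("updates")
        deduped.sparkSession.sql("""
            MERGE INTO bronze.events AS t
            USING updates AS s
              ON t.event_id = s.event_id
            WHEN MATCHED THEN UPDATE SET *
            WHEN NOT MATCHED THEN INSERT *
        """)

    events = (
        spark.readStream
             .format("cloudFiles")                              # Auto Loader
             .option("cloudFiles.format", "json")
             .option("cloudFiles.schemaLocation", CHECKPOINT)   # schema inference + evolution
             .option("cloudFiles.schemaEvolutionMode", "addNewColumns")
             .load(SOURCE_PATH)
             .withColumn("event_time", F.col("event_time").cast("timestamp"))
             .withWatermark("event_time", "30 minutes")         # bound state for late data
    )

    (
        events.writeStream
              .foreachBatch(upsert_batch)
              .option("checkpointLocation", CHECKPOINT)
              .trigger(availableNow=True)                       # or processingTime="1 minute"
              .start()
    )

Writing each micro-batch through a MERGE keyed on event_id keeps the load idempotent, so a restarted or replayed batch does not duplicate rows.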
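
Similarly, a hedged sketch of the "MERGE for SCD" item from the Delta Lake bullet: a simplified two-pass SCD Type 2 upsert expressed as Spark SQL run from Python. The gold.dim_customer and silver.customer_updates tables, the customer_id key, and the tracked columns are illustrative assumptions only.

    # Assumes a Databricks notebook where `spark` is predefined; the dimension
    # table gold.dim_customer and staging table silver.customer_updates are
    # hypothetical names used only to illustrate the pattern.

    # Pass 1: expire the current row when a tracked attribute changes,
    # and insert brand-new business keys as their first current version.
    spark.sql("""
        MERGE INTO gold.dim_customer AS d
        USING silver.customer_updates AS u
          ON d.customer_id = u.customer_id AND d.is_current = true
        WHEN MATCHED AND (d.email <> u.email OR d.segment <> u.segment) THEN
          UPDATE SET is_current = false, valid_to = u.effective_date
        WHEN NOT MATCHED THEN
          INSERT (customer_id, email, segment, valid_from, valid_to, is_current)
          VALUES (u.customer_id, u.email, u.segment, u.effective_date, NULL, true)
    """)

    # Pass 2: add the new current version for the rows expired in pass 1.
    spark.sql("""
        INSERT INTO gold.dim_customer
        SELECT u.customer_id, u.email, u.segment, u.effective_date, NULL, true
        FROM silver.customer_updates AS u
        JOIN gold.dim_customer AS d
          ON d.customer_id = u.customer_id
         AND d.is_current = false
         AND d.valid_to = u.effective_date
    """)

    # Routine Delta maintenance called out in the posting.
    spark.sql("OPTIMIZE gold.dim_customer ZORDER BY (customer_id)")
    spark.sql("VACUUM gold.dim_customer RETAIN 168 HOURS")

The first MERGE expires the current row when a tracked attribute changes (or inserts brand-new keys), the second statement adds the replacement current version, and OPTIMIZE/Z-ORDER plus VACUUM cover the maintenance the posting mentions.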

Job Classification

Industry: IT Services & Consulting
Functional Area / Department: Engineering - Software & QA
Role Category: Software Development
Role: Data Engineer
Employment Type: Full-time

Contact Details:

Company: Accion Labs
Location(s): Pune



Key Skills: ADLS Gen2, Azure Data Factory, Microsoft Fabric, PySpark, Databricks, Azure Synapse, Warehouse, Lakehouse


Salary: ₹ Not Disclosed

Similar positions

Python AI Engineer

  • TEKsystems
  • 5 - 8 years
  • Hyderabad
  • 3 days ago
₹ Not Disclosed

Software Development Engineer, Data Collection Technology

  • Morningstar
  • 2 - 5 years
  • Mumbai
  • 4 days ago
₹ Not Disclosed

Mobile DevOps Engineer

  • Valuelabs
  • 7 - 12 years
  • Dubai
  • 5 days ago
₹ Not Disclosed

Custom Software Engineer

  • Accenture HR Aditi
  • 3 - 8 years
  • Noida, Gurugram
  • 5 days ago
₹ Not Disclosed

Accion Labs

Accion Labs India Private Limited is a software development company offering a full range of product life-cycle services in the emerging technology segment. This includes Web 2.0, Open Source, SaaS/Cloud, Mobility, IT Operations Management/ITSM, Big Data and traditional BI/DW. ...