Strong knowledge of Extract, Transform, and Load (ETL) processes using frameworks such as Azure Data Factory, Synapse, or Databricks; experience establishing cloud connectivity between systems such as ADLS, ADF, Synapse, and Databricks.
Candidates must possess hands-on Power BI skills.
Candidates must have a good understanding of Informatica.
Design and develop ETL processes based on functional and non-functional requirements in Python/PySpark within the Azure platform.
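As a rough illustration of the extract-transform-load pattern this requirement refers to, here is a minimal sketch in plain Python, with in-memory rows standing in for Spark DataFrames; all field names are hypothetical, and in PySpark the same steps would map to `spark.read`, DataFrame transformations, and a write sink:

```python
# Minimal ETL sketch. In PySpark the equivalent flow would be:
# spark.read.parquet(<ADLS path>) -> filter/withColumn -> df.write.saveAsTable(...)

def extract(raw_rows):
    """Extract step: the source here is an in-memory list; on Azure this
    would typically read from an ADLS container or a Synapse table."""
    return list(raw_rows)

def transform(rows):
    """Transform step: drop incomplete records and normalize fields
    (an example of enforcing a non-functional data-quality requirement)."""
    cleaned = []
    for row in rows:
        if row.get("customer_id") is None:
            continue  # reject rows missing the key
        cleaned.append({
            "customer_id": int(row["customer_id"]),
            "country": str(row.get("country", "")).strip().upper(),
        })
    return cleaned

def load(rows):
    """Load step: returns the rows; in practice this would write to a
    data mart table or a Synapse/Databricks sink."""
    return rows

raw = [{"customer_id": "42", "country": " ca "}, {"customer_id": None}]
result = load(transform(extract(raw)))
print(result)
```

The three functions are kept separate so each stage can be tested and swapped independently, which mirrors how ETL pipelines are typically structured in Data Factory or Databricks notebooks.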
A minimum of 5 years' experience with large SQL data marts and expert relational database experience. Candidates should demonstrate the ability to navigate massive volumes of data to deliver effective and efficient data extraction, design, load, and reporting solutions to business partners.
Experience in troubleshooting and supporting large databases and testing activities; identifying, reporting, and managing database security issues; user access management; designing database backup, archiving, and storage; performance tuning; ETL import of large volumes of data extracted from multiple systems; and capacity planning.
Experience in T-SQL programming, the Azure Data Factory framework, and Python scripting.
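To illustrate the kind of T-SQL-plus-Python scripting this requirement combines, a minimal sketch that builds a T-SQL MERGE (upsert) statement from Python. The table and column names are hypothetical, and running the statement would require a database driver such as pyodbc; only the statement construction is shown here:

```python
def build_merge_sql(target, source, key, columns):
    """Build a T-SQL MERGE statement for upserting staged rows into a
    target table. Table and column names are assumed to be trusted
    identifiers controlled by the pipeline, not user input."""
    set_clause = ", ".join(f"t.{c} = s.{c}" for c in columns)
    col_list = ", ".join([key] + columns)
    src_list = ", ".join(f"s.{c}" for c in [key] + columns)
    return (
        f"MERGE {target} AS t "
        f"USING {source} AS s ON t.{key} = s.{key} "
        f"WHEN MATCHED THEN UPDATE SET {set_clause} "
        f"WHEN NOT MATCHED THEN INSERT ({col_list}) VALUES ({src_list});"
    )

# Hypothetical staging-to-dimension upsert, as used in an incremental load.
sql = build_merge_sql("dbo.DimCustomer", "stg.Customer",
                      "customer_id", ["name", "country"])
print(sql)
```

Generating the statement in Python keeps the pipeline logic (which tables, which keys) in one scripted place, while the heavy lifting still happens in the database engine.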
Financial institution data mart experience is an asset.
Flexible and willing to learn; a can-do attitude is key.
Strong verbal and written communication skills
Job Classification
Industry: IT Services & Consulting
Functional Area / Department: IT & Information Security
Role Category: IT Infrastructure Services
Role: IT Infrastructure Services - Other
Employment Type: Full time