Job Description
We are seeking a skilled and detail-oriented DBT Developer to join our cross-functional Agile team. In this role, you will be responsible for designing, building, and maintaining modular, reliable data transformation pipelines using dbt (Data Build Tool) in a Snowflake environment. You will collaborate closely with backend and frontend engineers, product managers, and analysts to create analytics-ready data models that power application features, reporting, and strategic insights. This is an exciting opportunity for someone who values clean data design, modern tooling, and working at the intersection of engineering and business.

Key Responsibilities
Design, build, and maintain scalable, modular dbt models and transformation pipelines (a minimal sketch follows this list)
Write SQL to transform raw data into curated, tested datasets in Snowflake
Collaborate with full-stack developers and UI/UX engineers to support application features that rely on transformed datasets
Work closely with analysts and stakeholders to gather data requirements and translate them into reliable data models
Enforce data quality through rigorous testing, documentation, and version control in dbt
Participate in Agile ceremonies (e.g., stand-ups, sprint planning) and manage tasks using Jira
Integrate dbt into CI/CD pipelines and support automated deployment practices
Monitor data performance and pipeline reliability, and proactively resolve issues
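To make the first responsibility concrete, here is a minimal sketch of a modular dbt staging model. It is illustrative only: the source (raw.trades), model name (stg_trades), and all column names are hypothetical placeholders, not details from this posting.

```sql
-- models/staging/stg_trades.sql
-- Minimal dbt staging model sketch; the raw source and every column
-- name below are hypothetical placeholders.
with source as (

    -- {{ source() }} resolves to a raw table declared in a sources .yml file
    select * from {{ source('raw', 'trades') }}

),

renamed as (

    select
        trade_id,
        account_id,
        cast(executed_at as timestamp_ntz) as executed_at,  -- Snowflake timestamp type
        amount
    from source

)

select * from renamed
```

Staging models like this give downstream marts a single, consistently named entry point to each raw table, which is what keeps a dbt project modular as it grows.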
Mandatory Qualifications & Skills
3–5 years of experience in data engineering or analytics engineering, with a focus on SQL-based data transformation
Hands-on production experience using dbt as a primary development tool
Strong command of SQL and a solid understanding of data modeling best practices (e.g., star/snowflake schema; a brief sketch follows the lists below)
Proven experience with Snowflake as a cloud data warehouse
Familiarity with Git-based version control workflows
Strong communication and collaboration skills, with the ability to work across engineering and business teams
Experience working in Agile/Scrum environments and managing work using Jira

Nice-to-Have Skills
Exposure to CI/CD pipelines and integrating dbt into automated workflows
Experience with cloud platforms such as AWS
Familiarity with Docker and container-based development
Knowledge of data orchestration tools (e.g., Airflow, Dagster, Prefect)
Understanding of how data is consumed in downstream analytics tools (e.g., Looker, Tableau, Power BI)
Basic Python skills for data pipeline integration or ingestion
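As a hedged illustration of the star-schema modeling called out above, a dbt fact model might join a staging model to a conformed dimension as below. All model and column names (stg_trades, dim_accounts, account_key) are assumptions made for the sketch, not details from this posting.

```sql
-- models/marts/fct_trades.sql
-- Sketch of a star-schema fact model in dbt; the upstream models and
-- columns referenced here are hypothetical placeholders.
select
    t.trade_id,
    t.executed_at,
    a.account_key,   -- surrogate key carried over from the account dimension
    t.amount
from {{ ref('stg_trades') }} as t
left join {{ ref('dim_accounts') }} as a
    on t.account_id = a.account_id
```

In a real project, dbt's built-in generic tests (e.g., unique and not_null, declared in a .yml file) would typically guard the keys this join depends on, which is the kind of rigorous testing the responsibilities above describe.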
Preferred Experience
A track record of building and maintaining scalable dbt projects in a production setting
Experience working in cross-functional teams involving developers, analysts, and product managers
A strong sense of ownership, documentation habits, and attention to data quality and performance
Job Classification
Industry: Banking
Functional Area / Department: Engineering - Software & QA
Role Category: DBA / Data warehousing
Role: Data Warehouse Developer
Employment Type: Full time
Contact Details:
Company: Capco
Location(s): Pune
Keyskills:
Backend
Version control
Data modeling
Data quality
Capital market
JIRA
Analytics
Financial services
SQL
Python