Responsibilities:
- Integrating data seamlessly across systems using APIs and connectors
- Designing scalable data architectures to support analytical needs
- Building and maintaining data ingestion pipelines from databases, event streams, files, and APIs
- Implementing checks and validations to maintain data integrity
- Optimizing database performance and query execution in BigQuery
- Troubleshooting, debugging, and resolving data pipeline failures
- Building and maintaining data transformation pipelines and orchestrating the correct dependencies
- Implementing and improving alerting and monitoring for data observability
- Adhering to Peloton development and modeling standards, change management processes, and security/access policies
- Collaborating with cross-functional teams to ensure reliable, timely data delivery
- Managing data storage solutions, ensuring scalability, security, and compliance
- Developing documentation as needed
- Identifying data discrepancies against source systems and performing variance analysis
Tools:
- dbt
- BigQuery
- Airflow
- Airbyte
- General GCP services (especially Cloud Run, Cloud Composer, Vertex AI, and Cloud DLP)
Experience:
Engineer II = 3-4 years
Engineer III = 5-7 years
Senior = 8+ years

Key skills: Airflow, BigQuery, dbt (Data Build Tool), GCP, Python
Trigent is a leading provider of IT services and solutions, headquartered in Boston, USA, with development centers in India. We are committed to delivering high-quality software solutions and services to our clients across the globe. Our expertise spans a wide range of industries.