
Data Architect @ Compunnel



Job Description

Role: Senior Data Architect
Experience: 12+ Years
Location: Noida / Remote

Mode: Full-time / Contract

Budget: 35 LPA


Responsibilities:

Design and Strategy:

Work with the data architecture team to define data architecture blueprints for our products, including data flow diagrams, system integrations, and storage solutions. Continuously refine the architecture to meet evolving business requirements and to incorporate new AWS capabilities and industry best practices.


Cloud Data Platform Development:

Lead the development of our cloud-based data platform on AWS. Implement data pipelines and warehouses using AWS services (e.g., AWS Glue for ETL, AWS Lambda for serverless processing, Amazon Redshift for data warehousing, and S3 for data storage).


Big Data & Legacy Integration:

Oversee the ingestion of large-scale datasets from various sources (transactional systems, APIs, external files). Optimize processing of big data using Spark and integrate legacy Hadoop-based data into our AWS environment.


Data Modeling:

Develop and maintain data models (conceptual, logical, physical) for our databases and data lakes. Design relational schemas and dimensional models that cater to both operational applications and analytical workloads. Ensure data is organized for easy access and high performance (for example, optimizing Redshift schema design and using partitioning or sort keys appropriately).

Advanced Analytics Enablement:

Work closely with Data Science and Analytics teams to enable AI and advanced analytics. Provide well-structured data sets and create pipelines that feed machine learning models (e.g., customer personalization models, predictive analytics). Implement mechanisms to handle real-time streaming data (using tools like Kinesis or Kafka if needed) and ensure data quality and freshness for AI use cases.
Efficiency and Scalability:

Design efficient, scalable processes for data handling. This includes optimizing ETL jobs (monitoring and tuning Glue/Spark jobs), implementing incremental data loading strategies instead of full loads where possible, and ensuring our data infrastructure can scale to growing data volumes. You will continually seek opportunities to automate manual data management tasks and improve pipeline reliability (CI/CD for data pipelines).


Data Governance & Security:

Embed data governance into the architecture: implement data cataloging, lineage tracking, and governance policies.

Ensure compliance with data privacy and security standards: implement access controls, encryption (at-rest and in-transit), and data retention policies aligned with Alight and client requirements. Work with the InfoSec team to perform regular audits of data access and to support features like data masking or tokenization for sensitive information.


Collaboration and Leadership:

Collaborate with other technology leadership and architects, product managers, business analysts, and engineering leads to understand data needs and translate them into technical solutions.

Provide technical leadership to data engineers: set development standards, guide them in choosing the right tools and approaches, and conduct design/code reviews. Lead architecture review sessions and be the go-to expert for any questions on data strategy and implementation.

Innovation and Thought Leadership:

Stay abreast of emerging trends in data architecture, big data, and AI.

Evaluate and recommend new technologies or approaches (for example, evaluate the use of data lakehouses, graph databases, or new AWS analytics services). Provide thought leadership on how Alight can leverage data for competitive advantage, and pilot proofs-of-concept for new ideas.

Required Qualifications:

Experience: 10+ years (preferred 15-18 years) of experience in data architecture, data engineering, or related fields, with a track record of designing and implementing large-scale data solutions. Demonstrated experience leading data-centric projects from concept to production.

Hands-on experience in:

AWS Cloud & Big Data Expertise
Data Modeling & Warehousing

Programming & Scripting: Proficiency in Python (or Scala/Java) for ETL/ELT scripting, and solid SQL skills for data manipulation and analysis.

Experience with infrastructure-as-code (Terraform/CloudFormation) and CI/CD pipelines for deploying data infrastructure is a plus.
Analytics and AI Orientation
Leadership & Soft Skills

Education: Bachelor's degree in Computer Science, Information Systems, or a related field required. (Master's degree in a relevant field is a plus.)

Job Classification

Industry: IT Services & Consulting
Functional Area / Department: Engineering - Software & QA
Role Category: Software Development
Role: Technical Architect
Employment Type: Full time

Contact Details:

Company: Compunnel
Location(s): Noida, Gurugram



Keyskills: Data Architecture, ETL, AWS, Python, SQL, Data Modeling

