4-8+ years of overall experience, with at least 4 years of experience with Big Data tools and technologies such as Spark, Kafka, Flume, Sqoop, Hive, HDFS, MapReduce, HBase, etc.
Good understanding of cloud-based SaaS, PaaS, and IaaS solutions, and experience deploying Big Data solutions on AWS, Azure, or Google Cloud.
Experience handling hybrid BI/DWH implementations, i.e., using both traditional and Big Data technologies.
Experience with Big Data technologies: Apache Hive, Apache Hue, Apache Flink, Apache Spark, Apache Parquet, Apache Mesos.
Experience with Apache Kafka.
Development experience in any of Java, Scala, or Python, along with microservices, event sourcing, KSQL, AWS DynamoDB, and AWS RDS.
Strong Telecom domain experience.
Experience with Oracle GoldenGate for Big Data, Attunity Replicate (Change Data Capture), and Alluxio (virtual distributed storage). Other technologies/frameworks: HDFS, Avro.
Implementation experience with data retention policies, error logging and handling mechanisms, restartability options in case of job failure, etc.
Security implementation, including handling of customer-sensitive information and awareness of EU GDPR compliance.
Experience implementing authentication and authorization mechanisms for secure data access.
Education:
UG: B.Tech/B.E. in Any Specialization, Computers, or Electronics/Telecommunication