
Hadoop Administrator @ Smartavya Analytica


Job Description

Job Title: Hadoop Administrator

Location: Chennai, India

Experience: 5+ years of experience in IT, including at least 2 years of cloud and system administration and at least 3 years of hands-on experience with, and a strong understanding of, big data technologies in the Hadoop ecosystem (Hive, HDFS, MapReduce, Flume, Pig, Cloudera, HBase, Sqoop, Spark, etc.).


Company: Smartavya Analytica Private Limited is a niche Data and AI company based in Pune. Established in 2017, we are pioneers in data-driven innovation, transforming enterprise data into strategic insights. Our team has experience handling large datasets of up to 20 PB in a single implementation and has delivered many successful data and AI projects across major industries, including retail, finance, telecom, manufacturing, insurance, and capital markets. We are leaders in Big Data, Data Warehouse and Data Lake solutions, data migration services, and Machine Learning/Data Science projects across on-premises, public cloud, and private cloud platforms, with super-specialization in very large data platforms. Website: https://smart-analytica.com


Job Overview: Smartavya Analytica Private Limited is seeking an experienced Hadoop Administrator to manage and support our Hadoop ecosystem. The ideal candidate will have strong expertise in Hadoop cluster administration, excellent troubleshooting skills, and a proven track record of maintaining and optimizing Hadoop environments.


Key Responsibilities:

  • Install, configure, and manage Hadoop clusters, including HDFS, YARN, Hive, HBase, and other ecosystem components.
  • Monitor and manage Hadoop cluster performance, capacity, and security.
  • Perform routine maintenance tasks such as upgrades, patching, and backups.
  • Implement and maintain data ingestion processes using tools like Sqoop, Flume, and Kafka.
  • Ensure high availability and disaster recovery of Hadoop clusters.
  • Collaborate with development teams to understand requirements and provide appropriate Hadoop solutions.
  • Troubleshoot and resolve issues related to the Hadoop ecosystem.
  • Maintain documentation of Hadoop environment configurations, processes, and procedures.

Requirements:

  • Experience installing, configuring, and tuning Hadoop distributions.
  • Hands-on experience with Cloudera.
  • Understanding of Hadoop design principles and the factors that affect distributed system performance, including hardware and network considerations.
  • Provide infrastructure recommendations, capacity planning, and workload management.
  • Develop utilities to monitor the cluster using tools such as Ganglia and Nagios.
  • Manage large clusters with huge volumes of data.
  • Perform cluster maintenance tasks.
  • Add and remove nodes; monitor and troubleshoot the cluster.
  • Manage and review Hadoop log files.
  • Install and implement security for Hadoop clusters.
  • Install Hadoop updates, patches, and version upgrades, and automate these tasks through scripts.
  • Act as the point of contact for vendor escalation; work with Hortonworks to resolve issues.
  • Conceptual/working knowledge of basic data management concepts such as ETL, reference/master data, data quality, and RDBMS.
  • Working knowledge of a scripting language such as Shell, Python, or Perl.
  • Experience with orchestration and deployment tools.

Academic Qualification: BE / B.Tech in Computer Science or equivalent, along with hands-on experience dealing with large data sets and distributed computing in data warehousing and business intelligence systems using Hadoop.

Job Classification

Industry: IT Services & Consulting
Functional Area / Department: Engineering - Software & QA
Role Category: Software Development
Role: Big Data Engineer
Employment Type: Full time

Contact Details:

Company: Smartavya Analytica
Location(s): Chennai



Keyskills: Cloudera, Hadoop Cluster, HDFS, CDP, Hadoop Administration, ETL Pipelines, PySpark, Hadoop Ecosystem, Hadoop, Kafka, YARN, Big Data Administration, HBase, Hive, Apache Pig, Sqoop, Spark, Oozie


Salary: Not Disclosed

