Hadoop Administrator Job Vacancy in Hewlett Packard Enterprise Bengaluru, Karnataka

Are you looking for a new job or for better opportunities?
We have a new job opening for you.

Full Details:
Company Name: Hewlett Packard Enterprise
Location: Bengaluru, Karnataka
Position: Hadoop Administrator

Job Description: Hadoop Administrator
Hewlett Packard Enterprise advances the way people live and work. We bring together the brightest minds to create breakthrough technology solutions, helping our customers make their mark on the world.
Our new innovative IT services organization is HPE Pointnext. We have the expertise to advise, integrate, and accelerate our customers’ outcomes from their digital transformation.
Job Profile:
The primary function of a support consultant is to provide in-depth technical and technological expertise in a particular area of focus, so that we can develop a genuinely customized solution for every customer. Initially, support consultants provide services to a limited number of small accounts to which they are personally assigned. The work becomes more specialized as you move through the experience levels, up to the highly complex design, scoping, and implementation leadership of international and strategic accounts, working at the highest level with customers and managing large projects.
Roles and Responsibilities:
Requirements:
- Minimum 6+ years of relevant experience in Hadoop administration.
- Ability to deploy a Hadoop cluster, add and remove nodes, keep track of jobs, monitor critical parts of the cluster, configure NameNode high availability, schedule and configure jobs, and take backups.
- General operational expertise: good troubleshooting skills and an understanding of system capacity, bottlenecks, and the basics of memory, CPU, OS, storage, and networks.
- Hadoop ecosystem skills such as HBase, Hive, Pig, Mahout, etc.
- Good knowledge of Linux, as Hadoop runs on Linux.
- Familiarity with open-source configuration management and deployment tools such as Puppet or Chef, and with Linux scripting.
- Knowledge of troubleshooting core Java applications is a plus.

Responsibilities:
- Implementation and ongoing administration of Hadoop infrastructure.
- Aligning with the systems engineering team to propose and deploy the new hardware and software environments required for Hadoop and to expand existing environments.
- Working with data delivery teams to set up new Hadoop users. This includes setting up Linux users, setting up Kerberos principals, and testing HDFS, Hive, Pig, and MapReduce access for the new users (a minimal access-check sketch follows this list).
- Cluster maintenance, including creation and removal of nodes, using tools such as Cloudera Manager Enterprise and Dell OpenManage.
- Performance tuning of Hadoop clusters and Hadoop MapReduce routines.
- Screening Hadoop cluster job performance and capacity planning.
- Monitoring Hadoop cluster connectivity and security.
- Managing and reviewing Hadoop log files.
- File system management and monitoring.
- HDFS support and maintenance.
- Diligently teaming with the infrastructure, network, database, application, and business intelligence teams to guarantee high data quality and availability.
- Collaborating with application teams to install operating system and Hadoop updates, patches, and version upgrades when required.
- Acting as point of contact for vendor escalation.
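To make the user-onboarding duty above concrete, here is a minimal sketch of the kind of access check an administrator might run after provisioning a new user. It is an illustration, not part of the posting: it assumes a Kerberos-secured cluster whose core-site.xml and hdfs-site.xml are on the classpath, and the principal newuser@EXAMPLE.COM, its keytab path, and the /user/newuser home directory are hypothetical placeholders.

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.security.UserGroupInformation;

/**
 * Minimal sketch: verify HDFS access for a newly onboarded user on a
 * Kerberos-secured cluster. Principal, keytab path, and home directory
 * below are hypothetical placeholders.
 */
public class VerifyHdfsAccess {
    public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();
        // Assumes the cluster's client configs are on the classpath and
        // Kerberos authentication is enabled cluster-side.
        conf.set("hadoop.security.authentication", "kerberos");
        UserGroupInformation.setConfiguration(conf);

        // Authenticate as the new user's Kerberos principal.
        UserGroupInformation.loginUserFromKeytab(
                "newuser@EXAMPLE.COM",
                "/etc/security/keytabs/newuser.keytab");

        try (FileSystem fs = FileSystem.get(conf)) {
            Path home = new Path("/user/newuser");

            // Write, check, and delete a small probe file to confirm
            // the user has working HDFS permissions in their home dir.
            Path probe = new Path(home, ".access-check");
            fs.create(probe, true).close();
            boolean ok = fs.exists(probe);
            fs.delete(probe, false);

            System.out.println("HDFS access for newuser: "
                    + (ok ? "OK" : "FAILED"));
        }
    }
}
```

In practice the same check is often done from the shell after creating the Linux account and Kerberos principal; the Java version is shown here only because it exercises the same client path that Hive, Pig, and MapReduce jobs use.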
Join us and make your mark!
We offer:
- A competitive salary and extensive social benefits
- A diverse and dynamic work environment
- Work-life balance and support for career development
- An amazing life inside the element!

Want to know more about it?
Then let’s stay connected!
https://www.facebook.com/HPECareers

Job ID: 1110412
This role has been designated as ‘Edge’, which means you will primarily work outside of an HPE office.

This post is listed under Technology.
Disclaimer: Hugeshout publishes the latest job info only and is in no way responsible for any errors. Users must do their own research before joining any company.
