Associate Tech Specialist Job Vacancy at Tech Mahindra, Noida, Uttar Pradesh – Updated today
Are you looking for a new job or for better opportunities?
We have a new job opening for you.
Full Details :
Company Name : Tech Mahindra
Location : Noida, Uttar Pradesh
Position : Associate Tech Specialist
Job Description :
Skill Set : Big Data Analytics
Total Experience : 8 Years
Job Post Date : 10/02/2022
Job Expiry Date : 12/03/2022
Domain : IT
Location : Noida [India]
Job Summary
A. Big Data (DevOps)

1. Responsibilities
- Designing and implementing Big Data solutions that leverage a Kubernetes cluster.
- Configuring hardware, peripherals, and services; managing settings and storage; deploying cloud-native applications; and monitoring and supporting a Kubernetes environment.
- Deploying and maintaining a Hadoop cluster; adding and removing nodes using cluster monitoring tools such as Kubernetes Cluster Manager, Ambari, and Apache Airflow; configuring NameNode high availability; and keeping track of all running Big Data jobs.
- Implementing, managing, and administering the overall Hadoop infrastructure.
- Taking care of the day-to-day running of Hadoop clusters.
- Working closely with the database, network, BI, and application teams to make sure all Big Data applications are highly available and performing as expected.
- Capacity planning and estimating the requirements for lowering or increasing the capacity of the Hadoop cluster.
- Deciding the size of the Hadoop cluster based on the data to be stored in HDFS.
- Ensuring that the Hadoop cluster is up and running at all times.
- Monitoring cluster connectivity and performance.
- Managing and reviewing Hadoop log files.
- Backup and recovery tasks.
- Resource and security management.
- Troubleshooting application errors and ensuring they do not recur.

2. Skills
- Excellent knowledge of UNIX/Linux OS.
- Excellent knowledge of cloud technology and microservices.
- Knowledge of cluster monitoring tools such as Kubernetes, Ambari, Ganglia, or Nagios.
- Knowledge of troubleshooting core Java applications is a plus.
- Good understanding of OS concepts, process management, and resource scheduling.
- Basics of networking, CPU, memory, and storage.
- Good hold of shell scripting.
- Knowledge of all the components in the Hadoop ecosystem, such as HDFS, Apache Hive, Apache HBase, Apache Airflow, Apache NiFi, Apache Kafka, Apache Spark, etc.
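To illustrate the kind of shell scripting a Hadoop admin in this role might do day to day (monitoring cluster health and alerting), here is a minimal sketch. It parses the "Live datanodes (N):" line found in `hdfs dfsadmin -report` output; the function names and the threshold are illustrative assumptions, not part of the posting, and the report text is passed in as a string so the helper can be exercised without a running cluster.

```shell
#!/bin/sh
# Illustrative sketch only, not a production monitoring tool.

live_datanodes() {
    # $1: text of an `hdfs dfsadmin -report`-style output.
    # Extracts N from a line like "Live datanodes (4):".
    printf '%s\n' "$1" | sed -n 's/^Live datanodes (\([0-9][0-9]*\)).*/\1/p'
}

check_cluster() {
    # $1: report text, $2: minimum acceptable number of live datanodes.
    live=$(live_datanodes "$1")
    if [ "${live:-0}" -lt "$2" ]; then
        echo "ALERT: only ${live:-0} live datanodes (minimum $2)"
        return 1
    fi
    echo "OK: $live live datanodes"
    return 0
}
```

In practice this would be invoked from cron as something like `check_cluster "$(hdfs dfsadmin -report)" 3`, with the alert line forwarded to the on-call channel.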
This post is listed under Technology.
Disclaimer : Hugeshout works to publish the latest job info only and is in no way responsible for any errors. Users must do their own research before joining any company.
