Hadoop Developer – Remote Job Vacancy at Zyte, Bengaluru, Karnataka
Are you looking for a new job or for better opportunities?
We have a new job opening for you.
Full Details:
Company Name: Zyte
Location: Bengaluru, Karnataka
Position: Hadoop Developer (Remote)
Job Description: About Us
At Zyte (formerly Scrapinghub), we eat data for breakfast, and you can eat your breakfast anywhere and work for Zyte. Founded in 2010, we are a globally distributed team of over 190 Zytans working from over 28 countries, on a mission to enable our customers to extract the data they need to continue to innovate and grow their businesses. We believe that all businesses deserve a smooth pathway to data.
For more than a decade, Zyte has led the way in building powerful, easy-to-use tools to collect, format, and deliver web data quickly, dependably, and at scale. The data we extract helps thousands of organizations make smarter business decisions, secure competitive advantage, and drive sustainable growth. Today, over 2,000 companies and 1 million developers rely on our tools and services to get the data they need from the web.
About the Job
As a DevOps engineer, you’ll be responsible for managing our Hadoop/HBase clusters and the storage backend application that runs on top of these clusters.
We are looking for a DevOps engineer to join our Infrastructure team to help maintain and improve our technology stack, which includes Hadoop, HBase, ELK, Prometheus/Grafana, and more.
Our Hadoop/HBase cluster stores hundreds of terabytes of data and is one of the primary components of our ScrapyCloud product, meaning your work will have a direct and visible impact on one of our star products.
Because it is such a key component, it’s crucial to ensure data availability and performance when our customers access their data.
Roles & Responsibilities:
Deploy and monitor the Hadoop/HBase services.
Develop the storage backend application in Java and Python.
Plan and deploy software upgrades, at both the application and Hadoop/HBase layers.
Write tools and scripts to automate common tasks in managing the Hadoop clusters (see the sketch after this list).
Troubleshoot Hadoop/HBase service issues.
Help improve monitoring and identify key performance metrics.
Proactively ensure the service runs with minimal disruption.
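For illustration only, here is a minimal sketch of the kind of automation script this role involves: a health check that polls the NameNode JMX endpoint and reports HDFS capacity and DataNode status. The host, port (Hadoop 3.x default 9870), and alert thresholds are assumptions for this example, not part of the job specification.

# Minimal sketch: HDFS health check via the NameNode JMX servlet.
# Assumes the NameNode web UI is reachable at NAMENODE_URL (Hadoop 3.x
# default port 9870); host and thresholds below are illustrative only.
import json
import sys
import urllib.request

NAMENODE_URL = "http://namenode.example.internal:9870"  # hypothetical host

def fsnamesystem_state():
    # The NameNode exposes FSNamesystemState metrics over its /jmx servlet.
    qry = "Hadoop:service=NameNode,name=FSNamesystemState"
    with urllib.request.urlopen(f"{NAMENODE_URL}/jmx?qry={qry}", timeout=10) as resp:
        return json.load(resp)["beans"][0]

def main():
    state = fsnamesystem_state()
    live = state["NumLiveDataNodes"]
    dead = state["NumDeadDataNodes"]
    used_pct = 100.0 * state["CapacityUsed"] / state["CapacityTotal"]
    print(f"live datanodes: {live}, dead: {dead}, capacity used: {used_pct:.1f}%")
    # Exit non-zero so a cron wrapper or alerting tool can flag problems.
    if dead > 0 or used_pct > 85.0:
        sys.exit(1)

if __name__ == "__main__":
    main()

A script like this can be run from cron or wrapped by an alerting tool; the same JMX endpoint also feeds dashboards such as the Prometheus/Grafana stack mentioned above.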
Requirements
5+ years of experience doing DevOps work (deployment, maintenance, monitoring, and application development) with Hadoop/HBase technology on moderate to large-scale clusters.
Deep understanding of HDFS and HBase internals.
Experience with Hadoop MapReduce development.
Solid experience in Linux troubleshooting, tuning, profiling, and monitoring.
Solid experience with Java, and understanding of the JVM.
Skills in the following programming languages: Python, Bash/Shell scripting.
Experience with Docker.
Understanding of load balancing and reverse proxying techniques.
Soft skills:
Highly organized, able to multitask, and able to work individually, within a team, and across teams.
Strong oral and written communication skills in English.
Flexibility around working hours: if there is an issue, you should use your initiative and help resolve it.
Maintain and respect the confidentiality of the large amounts of information you will have access to.
Bonus points:
Open source contributions related to HBase or other NoSQL databases.
Benefits
By joining the Zyte team, you will:
Become part of a self-motivated, progressive, multi-cultural team.
Have the freedom & flexibility to work remotely.
Get the chance to work with cutting-edge open source technologies and tools.
Get allocated hours for open source contributions.
Receive 35 days of paid time off per year.
This post is listed under Technology.
Disclaimer: Hugeshout works to publish the latest job info only and is in no way responsible for any errors. Users must do their own research before joining any company.