Hadoop Architect Job Vacancy at Securonix, Inc., Pune, Maharashtra
Are you looking for a new job or for better opportunities?
We have a new job opening for you.
Full Details :
Company Name : Securonix, Inc.
Location : Pune, Maharashtra
Position : Hadoop Architect
Job Description :
Securonix provides a next-generation Security Information and Event Management (SIEM) solution. As a recognized leader in the SIEM industry, Securonix helps some of the largest organizations globally to detect sophisticated cyberattacks and respond to them within minutes. With the Securonix SNYPR platform, organizations can collect billions of events each day and analyze them in near real time to detect advanced persistent threats (APTs), insider threats, privileged account misuse, and online fraud.
Securonix pioneered the User and Entity Behavior Analytics (UEBA) market and holds patents in the use of behavioral algorithms to detect malicious activities. The Securonix SNYPR platform is built on big data Hadoop technologies and is infinitely scalable. Our platform is used by some of the largest organizations in the financial, healthcare, pharmaceutical, manufacturing, and federal sectors.
Role Summary:-
The Hadoop Architect will be responsible for sizing; HA and DR setup; configuration, tuning, and scale-out; troubleshooting Sev-1 issues and other highly visible aspects of Hadoop tools and platforms; and designing optimizations, cloud-native enhancements, and cost controls. Expert knowledge of one of AWS, Microsoft Azure, or Google Cloud is required. Experience with Kafka, Cloudera HBase and YARN, EMR HBase, EMR YARN, and Solr is desirable.
The person must have an architect's acumen but should also have a passion for SRE and operations work, because much of Hadoop architecture emerges from the problems encountered while managing the environment operationally.
Responsibilities:-
Adding/removing servers in the availability set/load balancer.
Implement storage encryption, application gateway, local and virtual gateways, and vendor best practices.
Ability to develop deep knowledge of our complex applications.
Assist in the roll-out and deployment of new product features and installations on new cloud infrastructure to support our rapid iteration and constant growth.
Develop tools to improve our ability to rapidly deploy and effectively monitor custom applications in a large-scale UNIX environment.
Deploy, operate, maintain, secure and administer solutions that contribute to the operational efficiency, availability, performance and visibility of our customers’ infrastructure and Hadoop platform services, across multiple vendors (e.g., Cloudera, Hortonworks, EMR, Databricks, HDInsight).
Gather information and provide performance and root cause analytics and remediation planning for faults, errors, configuration warnings and bottlenecks within our Hadoop ecosystems.
Deliver well-constructed, explanatory technical documentation for the architectures we develop, and plan service integration, deployment automation and configuration management to meet business requirements within the infrastructure and Hadoop ecosystem.
Knowledge and Skills Requirements:
15+ years of overall experience, including 5+ years of Big Data experience.
3+ years of experience on cloud platforms such as AWS, Azure, or GCP.
Experience in Hadoop and its ecosystem tools, such as HDFS, YARN, HBase, Solr and Kafka.
Must know at least 2 of the following tools: Kafka, Solr, HBase, EMR YARN, Cloudera YARN.
Experience in AWS services such as EC2, VPC, S3, RDS, ElastiCache, Athena.
Extensive experience in provisioning and configuring resources, storage accounts, resource groups, and security ports.
Hands-on experience in Linux administration and troubleshooting (CentOS 7.x, Red Hat 7.x).
Strong understanding across cloud and infrastructure components (server, storage, network, data, and applications) to deliver end-to-end cloud infrastructure architectures and designs.
Knowledge of related Cloud technologies (Azure, AWS, GCP)
Passionate, persuasive, articulate Cloud professional capable of quickly establishing interest and credibility in how to design, deploy and operate cloud-based Architectures.
Ability to work with team members from around the globe/experience working with off-shore models.
Strong knowledge of auto-scaling and auto-healing for Big Data and Hadoop components.
A proactive approach to problem-solving and identifying improvements.
Must possess strong written and verbal communication skills and must be capable of understanding, documenting, communicating and presenting technical issues.
Securonix, Inc. provides equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion, gender, sexual orientation, gender identity, national origin, age, disability, genetic information, marital status, amnesty or status as a covered veteran in accordance with applicable federal, state and local laws. Securonix complies with applicable state and local laws governing nondiscrimination in employment in every location in which the company has facilities. This policy applies to all terms and conditions of employment, including hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation and training.
Securonix expressly prohibits any form of unlawful employee harassment based on race, color, religion, gender, sexual orientation, national origin, age, genetic information, disability or veteran status. Improper interference with the ability of Securonix employees to perform their expected job duties is absolutely not tolerated.
This post is listed under Technology.
Disclaimer : Hugeshout publishes the latest job information only and is not responsible for any errors. Users must do their own research before joining any company.