Big Data Analyst Job Vacancy in StatusNeo Technology Consulting, Gurgaon, Haryana

Are you looking for a new job or better opportunities?
We have a new job opening for:

Full Details:
Company Name: StatusNeo Technology Consulting
Location: Gurgaon, Haryana
Position: Big Data Analyst

Job Description : Striving for excellence is in our DNA.
We are more than just specialists; we are experts in agile software development with a keen focus on Cloud Native D3 (Digital, Data, DevSecOps). We help leading global businesses imagine, design, engineer, and deliver software and digital experiences that change the world.
Description
Headquartered in Princeton, NJ (United States), we are a multinational company that is growing fast. This role is based out of our India setup.
We believe that we are only as good as the quality of our people. Our offices are digital pods. Our clients are Fortune brands. We’re always looking for the most talented and skilled teammates. Do you have it in you?
About The Role
As a Big Data Analyst, you will have complete expertise in executing Data Engineering and Data Analytics projects from scratch for Fortune companies. You will work in close collaboration with the business, as well as other teams across StatusNeo, paying special attention to solution architecture and code quality.
We offer you a great opportunity to work on cutting-edge projects and enhance your knowledge base. You will level up your technical skills while performing many challenging and interesting tasks.
Responsibilities
Keep abreast of technological advancement, emerging standards, and new software solutions that may affect decisions about system builds or enhancements.
Work within a team of developers to complete proposed initiatives, contributing application architecture and implementation guidance.
Quickly gain an understanding of our clients’ requirements, technology needs, and solution architecture.
Fine-tune applications written in Spark and Hive to improve the overall performance of the pipelines.
Collaborate with clients and internal teams to develop appropriate solutions.
Implement end-to-end data pipelines for serving reporting and data science capabilities.
Write complex Hive queries using analytical functions.
Brainstorm with team members and prove the ability to think on the fly.
Requirements
Java developer skills (4+ years of experience)
Big Data Ecosystem: Spark, MapReduce, HDFS, Hive, HBase, Pig, Sqoop, Flume, Oozie, ZooKeeper, Hue, Cloudera (CDH), Hortonworks (HDP)
Cloud Services: EC2, EMR, S3, Redshift, Athena, AWS ECS, Terraform, AWS CloudFormation, AWS CloudWatch
Good knowledge of object-oriented/scripting languages: Python, SQL, Hive
Experience with big data distributions: Hortonworks / Cloudera
Experience in tools like Apache NiFi, Hive, and SAP Data Services is required.
Experience with the integration of different data sources with Data Lake is required.
Relational Databases: Oracle 12c, MySQL, MS-SQL Server
NoSQL Databases: HBase, Cassandra, and MongoDB
Architecture experience with Spark, AWS, and Big Data.
Should be competent in writing Scala algorithms and JUnit test cases.
Well versed in writing complex Hive queries using analytical functions.
Experience writing custom UDFs in Hive to support customers’ business requirements.
Active team player with excellent interpersonal skills; a keen learner with self-commitment and innovation.
Experience in fine-tuning applications written in Spark and Hive to improve the overall performance of the pipelines.
Hands-on experience with AWS Cloud and related tools, storage, and architectural aspects is required.
Experience in developing solutions on the Hadoop/Spark platform, preferably with a data background: data modelling, SQL programming, and data-warehousing (DWH) basics.
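As a rough illustration of the "analytical functions" requirement above: Hive's window functions, such as ROW_NUMBER() OVER (PARTITION BY ... ORDER BY ...), assign a per-group ranking to each row. The following minimal Python sketch (the sample data and function name are invented for illustration, not taken from the role) emulates that behavior outside Hive:

```python
from itertools import groupby
from operator import itemgetter

# Hypothetical sample rows, standing in for a Hive table.
rows = [
    {"dept": "sales", "name": "a", "salary": 100},
    {"dept": "sales", "name": "b", "salary": 120},
    {"dept": "eng", "name": "c", "salary": 150},
    {"dept": "eng", "name": "d", "salary": 90},
]

def row_number_per_partition(rows, partition_key, order_key):
    """Emulate ROW_NUMBER() OVER (PARTITION BY partition_key ORDER BY order_key DESC)."""
    out = []
    # Sort so rows in the same partition are adjacent, highest order_key first.
    ordered = sorted(rows, key=lambda r: (r[partition_key], -r[order_key]))
    for _, group in groupby(ordered, key=itemgetter(partition_key)):
        # Number rows 1, 2, ... within each partition.
        for i, r in enumerate(group, start=1):
            out.append({**r, "row_number": i})
    return out

for r in row_number_per_partition(rows, "dept", "salary"):
    print(r["dept"], r["name"], r["row_number"])
```

In Hive itself the equivalent would be a single SELECT with the OVER clause; the sketch only shows the partition-then-rank logic such queries express.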
Good To Have
Git (1-2 years of experience)
Develop test cases, conduct SIT, load test, and tune the performance of systems to meet acceptance criteria.
Perform and manage stages of the Scrum process and participate in systems reviews with the Project Lead/Manager.
Experience in the development of cloud-native applications is an added advantage.
Experience working in Agile teams
Web Technologies: JavaScript, CSS, HTML and JSP.
Operating Systems: Windows, UNIX/Linux, and Mac OS.
Build Management Tools: Maven, Ant.
IDE & Command line tools: Eclipse, IntelliJ, Toad, and NetBeans.
What We Offer
National and international business trips (when opportunities arise)
Culture of Knowledge Sharing and Training
Modern & lively working environment
Opportunity to write books, participate in conferences
International assignment
Relocation opportunities

This post is listed under Technology.
Disclaimer: Hugeshout works to publish the latest job info only and is not responsible for any errors. Users must research on their own before joining any company.
