SENIOR SOFTWARE ENGINEER/Group Data Technology Job Vacancy at HSBC, Hyderabad, Telangana

Are you looking for a new job or for better opportunities? We have a new job opening.

Full Details:
Company Name: HSBC
Location: Hyderabad, Telangana
Position: Senior Software Engineer, Group Data Technology

Job Description : The health and safety of our employees and candidates is very important to us. Due to the current situation related to the Novel Coronavirus (2019-nCoV), we’re leveraging our digital capabilities to ensure we can continue to recruit top talent at the HSBC Group. As your application progresses, you may be asked to use one of our digital tools to help you through your recruitment journey. If so, one of our Resourcing colleagues will explain how our video-interviewing technology will be used throughout the recruitment process and will be on hand to answer any questions you might have.
Some careers shine brighter than others.
If you’re looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further.
HSBC is one of the largest banking and financial services organizations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realize their ambitions.
We are currently seeking an experienced professional to join our team in the role of Senior Software Engineer.
Job Profile:
Capable of reviewing and accepting or challenging solutions provided by product vendors for platform optimization and root cause analysis tasks
Experience performing product upgrades of the core big data platform, expanding clusters, and setting up high availability for core services
Good knowledge of Hive as a service, HBase, Kafka, and Spark
Knowledge of basic data pipeline tools such as Sqoop, file ingestion, and DistCp, and their optimal usage patterns (see the DistCp sketch after this list)
Knowledge of the various file formats and compression techniques used within HDFS, and the ability to recommend the right patterns based on application use cases (see the file format sketch after this list)
Exposure to Amazon Web Services (AWS) and Google Cloud Platform (GCP) services relevant to the big data landscape, their usage patterns, and their administration
Working with application teams and enabling their access to the clusters with the right level of access control and logging, using Active Directory (AD) and big data tools
Setting up disaster recovery solutions for clusters using platform-native tools and custom code, depending on the requirements
Configuring Java heap and allied parameters to ensure all Hadoop services run at their best
Significant experience with Linux shell scripting and with Python or Perl scripting
Experience with industry-standard version control tools (Git, GitHub, Subversion) and automated deployment and testing tools (Ansible, Jenkins, Bamboo, etc.)
Experience on projects run with Agile/DevOps as the product management framework, a good understanding of the principles, and the ability to work as part of pod teams
Working knowledge of open-source RDBMSs: MySQL, Postgres, MariaDB
Ability to go under the hood of Hadoop services (Ambari, Ranger, etc.) that use a database as the driver
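
To illustrate the DistCp usage and disaster recovery work referenced above, here is a minimal Python sketch that wraps the standard hadoop distcp command to replicate a directory between two clusters. The NameNode hostnames, port, and path are hypothetical placeholders, and in practice the flags would be tuned to the actual requirements.

# Sketch: invoking DistCp from Python for cross-cluster replication.
# Cluster addresses and paths below are hypothetical placeholders.
import subprocess

def replicate(src_nn: str, dst_nn: str, path: str) -> None:
    """Copy `path` from the source cluster to the destination cluster.

    -update copies only files missing or changed at the target;
    -p preserves file status such as permissions and replication.
    """
    cmd = [
        "hadoop", "distcp",
        "-update", "-p",
        f"hdfs://{src_nn}:8020{path}",
        f"hdfs://{dst_nn}:8020{path}",
    ]
    subprocess.run(cmd, check=True)

replicate("nn-prod", "nn-dr", "/data/events")

A real disaster recovery job would typically schedule this incrementally and validate the copy, but the sketch shows the core operation.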
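As a sketch of the file format recommendation skill above, the following PySpark fragment writes the same small dataset in two common HDFS patterns: snappy-compressed Parquet for analytical scans, and gzip-compressed CSV for simple interchange. The paths, column names, and sample rows are invented for illustration, and a running SparkSession environment is assumed.

# Sketch: comparing HDFS-friendly file formats with PySpark.
# Paths and schema are hypothetical; assumes pyspark is installed.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("format-demo").getOrCreate()

df = spark.createDataFrame(
    [(1, "login"), (2, "logout")],
    ["user_id", "event"],
)

# Columnar Parquet with snappy: efficient when queries scan few columns.
df.write.mode("overwrite").option("compression", "snappy").parquet("/tmp/events_parquet")

# Gzip-compressed CSV: simple to exchange, but gzip is not splittable,
# so each file is read by a single task.
df.write.mode("overwrite").option("compression", "gzip").csv("/tmp/events_csv")

spark.stop()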
Requirements
Good experience administering a big data platform and the allied toolset, specifically Cloudera's big data platform software
Working knowledge of Hortonworks DataFlow (HDF) architecture, setup, and ongoing administration
Experience working in secured environments using a variety of technologies such as Kerberos, Knox, Ranger, KMS, encryption zones, and server SSL certificates
Prior experience with Linux system administration
Good experience with Hadoop capacity planning in terms of the HDFS file system and YARN resources (see the capacity-check sketch after this list)
Good stakeholder management skills: able to engage in formal and casual conversations and to drive the right decisions
Good troubleshooting skills: able to identify the specific service causing issues, review logs to identify problem entries, and recommend solutions working with the product vendor
4+ years of professional software administration experience, with at least 2 years in a big data environment
At least 2 years of Agile and SDLC experience
Implementation and administration of ETL tools such as Ab Initio and Pentaho
Good knowledge of ANSI-standard SQL and query optimization
Working knowledge of batch ingestion and data lake management
Implementation and administration of change data capture tools such as Attunity and IBM CDC
Contributions to Apache open source, or a public GitHub repository with a sizeable big data operations and application code base
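
As an illustration of the capacity planning skill listed above, here is a minimal Python sketch that shells out to the standard hdfs dfsadmin -report command and flags a cluster above a usage threshold. The 80% threshold is an arbitrary example chosen for the sketch, not a stated HSBC standard.

# Sketch: crude HDFS capacity check via `hdfs dfsadmin -report`.
# The 80% alert threshold is an illustrative assumption.
import re
import subprocess

def hdfs_used_percent() -> float:
    """Parse the cluster-wide 'DFS Used%' line from the dfsadmin report."""
    report = subprocess.run(
        ["hdfs", "dfsadmin", "-report"],
        capture_output=True, text=True, check=True,
    ).stdout
    match = re.search(r"DFS Used%:\s*([\d.]+)%", report)
    if match is None:
        raise RuntimeError("could not find 'DFS Used%' in report")
    return float(match.group(1))

if hdfs_used_percent() > 80.0:
    print("HDFS usage above 80%: plan for expansion or cleanup")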
You’ll achieve more when you join HSBC.
www.hsbc.com/careers
HSBC is committed to building a culture where all employees are valued, respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website.
Issued by – HSBC Software Development India

This post is listed under Technology.
Disclaimer: Hugeshout works to publish the latest job info only and is in no way responsible for any errors. Users must do their own research before joining any company.
