Data Domain Architect Assoc Sr – DDO Modelling – Job Vacancy at JPMorgan Chase Bank, N.A., Bengaluru, Karnataka

Are you looking for a new job or for better opportunities? We have a new job opening for you.

Full Details:
Company Name: JPMorgan Chase Bank, N.A.
Location: Bengaluru, Karnataka
Position: Data Domain Architect Assoc Sr – DDO Modelling

Job Description: As a Software Engineer, you will work as a consultative team member. Solutions Architects function as big data team members on engagements, supporting the implementation of Cloud and other Big Data Hadoop clusters (HDFS, MapReduce frameworks), Hive, Oozie and Spark. Engaging from the Proof of Concept (POC) stage through to the implementation of complex distributed production environments, you work collaboratively to deploy and implement Python solutions in production. You have the technical depth to roll up your sleeves and work with Hadoop and Hive, and the polish to present best practices and other solutions as needed.
Responsibilities:
- Designing and architecting solutions, scoping new engagements and implementations both short and long term, and guiding the team during product implementations.
- Resolving technical issues and advising on best practices for big data, Hadoop and Cloud environments.
- Driving successful installation, configuration, tuning and performance of the product.
- Assisting with capacity planning so the environment can scale.
- Writing and producing technical documentation.
- Being meticulous about tracking things and follow-through.
- Bringing strong Agile development skills and experience with SCRUM or similar methodologies.
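
As a rough, hypothetical illustration of the kind of Hadoop, Hive and Spark work described above (this sketch is not part of the posting; the database, table and column names are invented):

# Illustrative sketch only: a minimal PySpark job that reads a Hive table and
# writes an aggregated result back. "sales.orders", "order_date" and "amount"
# are hypothetical names, not anything specified in the role description.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("daily-order-rollup")   # hypothetical application name
    .enableHiveSupport()             # lets Spark read tables from the Hive metastore
    .getOrCreate()
)

# Read a Hive table registered in the metastore.
orders = spark.table("sales.orders")

# Roll orders up to one row per day.
daily = (
    orders.groupBy("order_date")
    .agg(
        F.sum("amount").alias("total_amount"),
        F.count("*").alias("order_count"),
    )
)

# Persist the result as a managed Hive table, replacing any previous run.
daily.write.mode("overwrite").saveAsTable("sales.daily_order_rollup")

spark.stop()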
Qualifications:
1) 5+ years of experience performing various roles in strategic planning, software development and solution architecting.
2) 2+ years of experience on a Cloud platform (AWS).
3) 3+ years of development experience with Big Data Hadoop clusters (HDFS, MapReduce frameworks), Hive, Oozie and Spark.
4) 5+ years of programming experience in Python and SAS, with strong database architecture knowledge.
5) Hands-on experience with AWS cloud services (VPC, EC2, S3, RDS, Redshift, DMS, EMR, Data Pipeline, CloudFormation, Lambda, SNS, SQS and more).
6) BS or higher in Computer Science or a related field.
7) Experience in Python, PySpark and other cloud-related technologies.
8) Being meticulous about tracking things and follow-through.
9) Knowledge of complex data pipelines and data transformation (see the illustrative sketch after this list).
10) Ability to understand big data use cases and recommend standard design patterns commonly used in Hadoop-based deployments.
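
As referenced in point 9, here is a rough, hypothetical sketch of a simple data-transformation pipeline of the kind such a role might involve; the S3 bucket names, paths and columns below are placeholders, not details from the posting:

# Illustrative sketch only: read raw CSV events from S3, clean them, and write
# partitioned Parquet back to S3 (a common pattern on EMR, where s3:// paths
# are handled by EMRFS). All bucket names, paths and columns are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("clean-and-partition-events").getOrCreate()

# Ingest raw CSV files that have a header row.
raw = (
    spark.read
    .option("header", "true")
    .csv("s3://example-raw-bucket/events/")
)

# Basic cleaning: de-duplicate on a key, parse timestamps, derive a date column.
cleaned = (
    raw.dropDuplicates(["event_id"])
    .withColumn("event_ts", F.to_timestamp("event_ts"))
    .withColumn("event_date", F.to_date("event_ts"))
    .filter(F.col("event_ts").isNotNull())
)

# Write curated data partitioned by date, a standard layout for Hadoop/S3 queries.
(
    cleaned.write
    .mode("overwrite")
    .partitionBy("event_date")
    .parquet("s3://example-curated-bucket/events/")
)

spark.stop()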
JPMorgan Chase & Co., one of the oldest financial institutions, offers innovative financial solutions to millions of consumers, small businesses and many of the world’s most prominent corporate, institutional and government clients under the J.P. Morgan and Chase brands. Our history spans over 200 years and today we are a leader in investment banking, consumer and small business banking, commercial banking, financial transaction processing and asset management.
We recognize that our people are our strength and the diverse talents they bring to our global workforce are directly linked to our success. We are an equal opportunity employer and place a high value on diversity and inclusion at our company. We do not discriminate on the basis of any protected attribute, including race, religion, color, national origin, gender, sexual orientation, gender identity, gender expression, age, marital or veteran status, pregnancy or disability, or any other basis protected under applicable law. In accordance with applicable law, we make reasonable accommodations for applicants’ and employees’ religious practices and beliefs, as well as any mental health or physical disability needs.

