Senior Software Engineer/Wealth & Personal Banking IT Job Vacancy at HSBC, Pune, Maharashtra

Are you looking for a new job or for better opportunities?
We have a new job opening for you.

Full Details :
Company Name : HSBC
Location : Pune, Maharashtra
Position : Senior Software Engineer, Wealth & Personal Banking IT (GCP DevOps)

Job Description :
Understand and translate business needs into data models supporting long-term solutions
Implement business and IT data requirements through new data strategies and designs across all data platforms (relational, dimensional, and NoSQL) and data tools (reporting, visualization, analytics, and machine learning)
Work with business and application/solution teams to implement data strategies, build data flows, and develop conceptual/logical/physical data models
Define and govern data modelling and design standards, tools, best practices, and related development for enterprise data models
Create logical and physical data models using best practices to ensure high data quality and reduced redundancy (see the illustrative sketch after this list)
Optimize and update logical and physical data models to support new and existing projects
Maintain conceptual, logical, and physical data models along with corresponding metadata
Develop best practices for standard naming conventions and coding practices to ensure consistency of data models
Recommend opportunities for reuse of data models in new environments
Perform reverse engineering of physical data models from databases and SQL scripts
Identify the architecture, infrastructure, and interfaces to data sources, tools supporting automated data loads, security concerns, analytic models, and data visualization
Evaluate data models and physical databases for variances and discrepancies
Validate business data objects for accuracy and completeness
Analyse data-related system integration challenges and propose appropriate solutions
Work proactively and independently to address project requirements, and articulate issues/challenges to reduce project delivery risks
Guide Data Analysts, Data Engineers, Visualisation Developers, and others on project limitations and capabilities, performance requirements, and interfaces
Review modifications to existing models to improve efficiency and performance
Examine new application designs and recommend corrections if required
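
Purely for illustration (not part of the original listing): a minimal PySpark sketch of the kind of dimensional modelling and data-flow work described above. All table and column names here are hypothetical.

# Sketch: derive a simple star-schema (dimension + fact) physical model
# from a raw source table with PySpark. Names are invented for illustration.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("dimensional-model-sketch")
    .enableHiveSupport()
    .getOrCreate()
)

# Raw source (logical entity): one row per sale event (hypothetical table).
raw = spark.table("raw.sales")

# Dimension: distinct customers, each given a surrogate key.
dim_customer = (
    raw.select("customer_id", "customer_name", "customer_region")
       .dropDuplicates(["customer_id"])
       .withColumn("customer_sk", F.monotonically_increasing_id())
)

# Fact: measures keyed by the dimension's surrogate key.
fact_sales = (
    raw.join(dim_customer.select("customer_id", "customer_sk"), "customer_id")
       .select("customer_sk", "sale_date", "quantity", "amount")
)

# Physical model: persist as managed tables, partitioning the fact table
# to reduce scan costs for date-bounded queries.
dim_customer.write.mode("overwrite").saveAsTable("analytics.dim_customer")
(fact_sales.write
    .mode("overwrite")
    .partitionBy("sale_date")
    .saveAsTable("analytics.fact_sales"))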
Requirements
Skillset :
Experience with Big Data and Cloud Technologies – Hadoop, Hive, PySpark, GCP (see the illustrative sketch after this list)
Hands-on data modelling experience – relational, dimensional, and/or analytic (using RDBMS, dimensional, and NoSQL data platforms, and ETL and data ingestion protocols)
Experience with data warehouse, data lake, and enterprise big data platforms in multi-data-centre contexts required
Good knowledge of metadata management, data modelling, and related tools (Visual Paradigm / Erwin) required
Well versed in Agile tools – Jira and Confluence
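
Again purely illustrative: a short PySpark sketch combining the Hive and GCP pieces of the skillset above – reading a Hive table and landing it on Google Cloud Storage as Parquet. It assumes a cluster (e.g. Dataproc) where the GCS connector is configured; the bucket and table names are invented.

# Sketch: copy a slice of a Hive table onto Google Cloud Storage as
# partitioned Parquet. Assumes the GCS connector is available (as on
# Dataproc); table and bucket names are hypothetical.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("hive-to-gcs-sketch")
    .enableHiveSupport()
    .getOrCreate()
)

# Hypothetical Hive source table, filtered to the slice being moved.
orders = spark.sql(
    "SELECT * FROM warehouse.orders WHERE order_date >= '2023-01-01'"
)

# gs:// paths resolve through the GCS connector configured on the cluster.
(orders.write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("gs://example-bucket/lake/orders/"))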

This post is listed under Technology.
Disclaimer : Hugeshout publishes the latest job information only and is in no way responsible for any errors. Users must do their own research before joining any company.
