Data Architect Job Vacancy in Johnson Controls Pune, Maharashtra – Updated today
Are you looking for a new job or for better opportunities?
We have a new job opening for you.
Full Details :
Company Name : Johnson Controls
Location : Pune, Maharashtra
Position : Data Architect
Job Description : Data Architect – WD30116102138
What you will do
As a Data Architect in OpenBlue you will define, optimise and govern the specialised technology landscape for the Data Platform. The role has a truly global scope and is accountable for providing a world-class and competitively differentiating data technology architecture for Johnson Controls.
The scope of this role will give you an opportunity to apply your specialist skills in data platforms and data engineering architectures across the full breadth of the data technology landscape required to add value to the OpenBlue platform. Your deep skills and technical fluency in this specialist area will help to equip the company with a data technology platform which supports the growth ambitions of the business and delivers a competitive advantage through data technology.
This is a rewarding and challenging technical role that requires deep knowledge and expertise to drive and deliver a data platform that supports the business strategy. You will work in a highly capable team of engineers and use your specialist knowledge to produce optimised designs and clear guidance for those teams to follow.
How you will do it
Work collaboratively with all levels of stakeholders to architect, implement and test Big Data-based analytical solutions from disparate sources
Demonstrate an understanding of concepts, best practices and functions to implement a Big Data solution in an enterprise environment
Establish, maintain and adhere to Enterprise Data Standards & Policies.
Create and document Logical & Physical Data Models using best practices to ensure high data quality and reduced redundancy.
Recommend solutions to improve existing functional data models.
Understand and translate business needs into scalable data models supporting enterprise views
Optimize and update logical and physical data models to support new and existing projects.
Maintain conceptual, logical and physical data models along with corresponding metadata.
Develop best practices for standard naming conventions and coding practices to ensure consistency of data models.
Ensure reusability of data models for future business requirements
Perform reverse engineering of physical data models from databases and SQL scripts.
Connect with the functional and technical leads when prioritizing the technology initiatives to support analytics
Information Security Responsibilities
Adhere to JCI policies, guidelines and procedures pertaining to the protection of information assets.
Adhere to JCI product security standards and guidelines for secured software product development.
Software and licenses shall be deployed and used only in accordance with JCI licensing agreements.
Unless provided in the applicable license, notice, or agreement, copyrighted software shall not be duplicated, except for backup and archival purposes.
Any software that is acquired illegally or does not have a valid license shall not be deployed or used. Copying of third party materials without an appropriate license is prohibited.
Employees, contractors and third-party personnel shall not copy, extract or reproduce in any way copyrighted material from the Internet on information systems provided by the organization, other than as permitted by copyright law.
Implement appropriate physical and technical safeguards to protect the confidentiality, integrity and availability of information assets.
All employees, contractors and third parties shall be responsible for reporting all information security incidents, alerts, warnings and suspected vulnerabilities in a timely manner, and for sharing such information only with authorized personnel.
Qualifications
What we look for
Assess the organization’s potential data sources (internal and external) and develop a blueprint to integrate, centralize, protect and maintain data, thus enabling data models for the enterprise.
Work with leadership and stakeholders to define the overall vision, mission and strategy for business data models that scale across the enterprise
Mentor and guide globally based technical project teams on Azure platform components
Use your specialised data architecture capability to optimise and maximise the value of the data architectures being built in OpenBlue.
Collaborate with your team to create & deliver Data Architecture Patterns which are consistent with a business that has significant growth aspirations and a competitive intent to leverage value from technology.
Assess new data technologies and articulate the transition from current to future state, striking the balance between standardization, customization, innovation, costs and risk management, while staying abreast of disruptive technology trends that can impact or benefit the business
Responsible for ensuring new data technologies and innovations are integrated into the organization, advising and recommending data architecture strategies, decisions, processes, tools, standards and methodologies.
Make bold technology decisions from a position of informed technical authority.
Required
Bachelor’s and/or Master’s degree in Software Engineering / Computer Science or an equivalent discipline
A minimum of 12-16 years of experience in the Data Engineering space using the above technologies, including ownership of a number of complex, high-volume data projects / products with global rollouts as an Architect, is mandatory.
Proven expertise of Azure data platform, CI/CD
Strong knowledge of Hadoop, MPP Data platform, Azure Data Factory, Azure HDI, Power BI and Open source technologies
Superior documentation and communications skills
Superb problem solving and outstanding communication skills with the ability to describe complex & abstract technical concepts to peers and engineers.
Familiarity with Data Streaming patterns and hybrid real-time / lambda integration architectures.
Familiarity with Machine Learning, Data Science & AI Technologies.
Experience building and operating IoT Solutions
5+ years utilizing Agile Product Management (Scrum)
Technology Requirement
Strong Leadership, organizational, and prioritization skills
Excellent written and verbal skills to communicate technical solutions to business teams
Demonstrated experience leading and influencing individuals and teams without formal organizational control
Understanding of trends, new concepts, industry standards and new technologies in the Data and Analytics space
Expert-level knowledge in two or more of the following specialisms, so that you can lead data technology architecture from a position of technical authority:
Data Engineering patterns and practices for efficient and optimised utilisation of raw data and IoT data.
Data Warehousing, semantic layer definitions and scaled data consumption patterns.
Distributed compute and processing data in parallel.
Cloud-agnostic data ecosystems, specifically with Azure, Ali, AWS, GCP and OpenShift.
Robust enterprise grade data integration, ingestion, management & pipelines.
Data streaming, Complex Event Processing and associated “Lambda”-style and Event Driven data architectures.
Operationalisation of Data Science outcomes, including continuous model qualification.
Hands-on Experience on Spark framework
Excellent programming background using Java, Scala or Python
Hands-on experience with multiple databases such as PostgreSQL, Snowflake, MS SQL Server and NoSQL stores (HBase / Cassandra, MongoDB), etc.
Data integration: ingestion mechanisms such as Confluent Kafka / Confluent Cloud.
Hands-on experience with solutions built on Kubernetes, using industry-standard patterns and tools; good knowledge of and experience with portability; and experience in component deployment on any platform (public or private).
Experience with building stream-processing systems, using solutions such as Storm or Spark-Streaming
Good conceptual knowledge in data modelling using dimensional & transactional modelling
Demonstrate strong analytical and problem-solving capability
Good understanding of the data eco-system, both current and future data trends
Exposure to data visualization techniques and analytics using tools like Tableau is required.
Job : Engineering
Primary Location : IN-Maharashtra-Pune
Organization : Bldg Technologies & Solutions
This post is listed under Technology
Disclaimer : Hugeshout works to publish the latest job info only and is in no way responsible for any errors. Users must research on their own before joining any company.