Senior Data Analyst Job Vacancy in Honeywell Bengaluru, Karnataka – Updated today
Are you looking for a new job or for better opportunities?
We have a new job opening for you.
Full details:
Company Name : Honeywell
Location : Bengaluru, Karnataka
Position : Senior Data Analyst
Job Description : Deliver business value through Right and Fast partnership
Join a team recognized for leadership, innovation and diversity
Honeywell is charging into the Industrial IoT revolution with the establishment of Honeywell Connected Enterprise (HCE), building on our heritage of invention and deep, on-the-ground industry expertise. HCE is the leading industrial disruptor, building and connecting software solutions to streamline and centralize the assets, people and processes that help our customers make smarter, more accurate business decisions. Moving at the speed of software, we are creating, innovating and delivering solutions fast, challenging the way things have always been done, piloting new ways for all of us to work, and expecting our successes to set new standards for our customers and for Honeywell.
We are looking for a Sales CRM Business Analyst who will work on large, significant Honeywell analytics programs that require enterprise-wide analytics expertise. The incumbent will manage cross-business stakeholders and work closely with the Forge Insights data engineering team to deliver strategic programs to the planned scope and schedule. The Business Analyst will play a key role in ensuring program deliverables are met and data quality requirements are delivered on time and within budget.
The right person for the job will apply their SAP and Salesforce technical know-how, combined with business understanding, to solve real-world problems faced by our company and to find opportunities for improvement across multiple projects, teams, and business units.
Job Responsibilities:
Business experience with the following SAP modules:
Sales and Distribution
Materials Management
Production (optional)
Salesforce CRM, especially opportunities, quotes, and tasks
Partner with the business to gather requirements related to SAP Sales and CRM initiatives
Partner with the HIA team to translate business requirements into solutions
Keep up to date with the latest releases and capabilities of applications such as SAP and Salesforce
Act as an intermediary between the technical data teams and client-facing teams to coordinate shared solutions
Play an instrumental role, as part of the Forge Insights organization, in maximizing the value of Salesforce and related tools and improving the client experience
Focus on both xHoneywell business stakeholders and internal IT project needs to create an environment and process that ensure the successful delivery of the program objectives
Regularly audit data to uncover data integrity issues and/or opportunities for process improvements
YOU MUST HAVE
Bachelor’s degree
8+ years of IT experience, with at least 4 years with SAP and Salesforce
3+ years of experience in SAP and CRM
Experience using SAP and Salesforce or similar CRM products to develop client or business solutions
As an Advanced Data Engineer, you will be part of a team that delivers contemporary analytics solutions for all Honeywell business groups and functions. You will build strong relationships with leadership to effectively deliver contemporary data analytics solutions and contribute directly to business success. You will develop solutions on various database systems such as Databricks, Hive, Hadoop, and PostgreSQL.
You will identify and implement process improvements, and because you don’t like doing the same thing twice, you will automate wherever you can. You always keep an eye on scalability, optimization, and process. You have worked with Big Data, IoT data, SQL, Azure, AWS, and a host of other acronyms.
You will work on a team including scrum masters, product owners, data architects, data engineers, data scientists, and DevOps. You and your team collaborate to build products from the idea phase through launch and beyond. The software you write makes it to production in a couple of sprints. Your team will be creating a new platform, drawing on your experience with APIs, microservices, and platform development.
YOU MUST HAVE
Bachelor’s degree in Computer Science, Engineering, Applied Mathematics or related field
6-8 years of data engineering experience
Should have developed and deployed complex big data ingestion jobs in Spark, Informatica BDM, or Talend, bringing prototypes to production on Hadoop/NoSQL/MPP platforms
Should have a minimum of 4 years of hands-on experience with Spark, Pig/Hive, etc., and with automation of data flows using Informatica, Spark, NiFi, or Airflow/Oozie
Minimum 3 years of experience developing and building applications to process very large amounts of structured and unstructured data, including streaming real-time data, using Python, Spark, and Scala (Kafka, Spark Streaming, or similar tools)
Hands-on experience with Databricks, Cloudera, Hortonworks, and/or cloud-based (AWS EMR, Azure Data Lake Storage) Hadoop distributions
Effective communication skills and succinct articulation
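The streaming requirement above centers on micro-batch aggregation: folding each incoming batch of events into a running state, which is the pattern engines like Spark Structured Streaming implement at scale on top of sources such as Kafka. A minimal pure-Python sketch of that pattern (the event names and batch contents here are illustrative only):

```python
from collections import Counter

def process_micro_batch(state: Counter, batch: list) -> Counter:
    """Fold one micro-batch of events into the running aggregate.

    This mirrors, in miniature, the stateful update step that
    micro-batch streaming engines perform on each trigger.
    """
    for event in batch:
        state[event] += 1
    return state

# Simulated stream of micro-batches; in production these would
# arrive continuously from a broker such as Kafka.
batches = [
    ["sensor_a", "sensor_b", "sensor_a"],
    ["sensor_b", "sensor_c"],
]

state = Counter()
for batch in batches:
    state = process_micro_batch(state, batch)

print(dict(state))  # running counts across all batches seen so far
```

In a real pipeline the same fold would be expressed declaratively (e.g. a `groupBy().count()` over a streaming DataFrame), with the engine handling checkpointing and fault tolerance.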
WE VALUE
Experience working in a Big Data environment – Azure/AWS cloud, Hive, Hadoop, or related databases
Drive and desire to learn and grow both technical and functional skill sets
Hands-on experience configuring Salesforce or a similar CRM product, including workflows, validation rules, and security controls
Proven ability to leverage analytical and problem-solving skills in a fast-paced environment
Strong presentation, communication (written and verbal), and interpersonal skills
Ability to juggle and prioritize multiple tasks within a collaborative team environment
Flexibility and willingness to do what it takes to get the job done
Experience documenting user stories and creating to-be process flow diagrams
Experience collaborating with business stakeholders
Experience on a Salesforce implementation through the full Software Development Lifecycle (SDLC); Agile project methodology experience preferred
Minimum 2 years of experience working with at least one NoSQL system (HBase, Cassandra, MongoDB, etc.)
In-depth knowledge of schema design to effectively tackle requirements
Experience in writing complex SQL statements
Experience with cloud-based deployments; understanding of containers and container orchestration (Swarm or Kubernetes)
Experience building advanced analytics solutions with data from enterprise systems such as ERPs, CRMs, and marketing tools
Experience with dimensional modeling, data warehousing and data mining
Good understanding of branching, build, and deployment, and of CI/CD tools such as Octopus and Bamboo
Experience working with Agile methodologies and Scrum
Knowledge of software best practices, such as Test-Driven Development (TDD)
Database performance management and API development
Technology upgrade oversight
Experience with visualization software (Tableau, Spotfire, QlikView, AngularJS, D3.js)
Understanding of best-in-class model and data configuration and development processes
Experience working with remote and global teams and cross team collaboration
Consistently makes timely decisions even in the face of complexity, balancing systematic analysis with decisiveness
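Two of the valued skills above, writing complex SQL and dimensional modeling, come together in querying a star schema: a fact table joined to dimension tables, aggregated, and filtered on the aggregate. A minimal sketch using Python's built-in `sqlite3` (the table and column names are illustrative, not from any Honeywell system):

```python
import sqlite3

# Minimal star schema: a fact table of orders and a customer dimension.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_customer (customer_id INTEGER PRIMARY KEY, region TEXT);
    CREATE TABLE fact_orders  (order_id INTEGER PRIMARY KEY,
                               customer_id INTEGER, amount REAL);
    INSERT INTO dim_customer VALUES (1, 'APAC'), (2, 'EMEA');
    INSERT INTO fact_orders  VALUES (10, 1, 250.0), (11, 1, 100.0),
                                    (12, 2, 300.0);
""")

# A complex query in miniature: join fact to dimension, aggregate
# per region, then filter on the aggregate with HAVING.
rows = conn.execute("""
    SELECT c.region, SUM(f.amount) AS total
    FROM fact_orders f
    JOIN dim_customer c ON c.customer_id = f.customer_id
    GROUP BY c.region
    HAVING SUM(f.amount) > 200
    ORDER BY total DESC
""").fetchall()

print(rows)  # regions whose order totals exceed the threshold
```

The same join-aggregate-filter shape scales up directly to warehouse engines like Hive or Databricks SQL over much larger fact tables.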
Additional Information
JOB ID: req337869
Category: Engineering
Location: HW Camp II,Bldgs 9A&9B,Plot C2,RMZ Ecoworld,Varturhobli,Sarjapur Marathahalli Outer Ring Road,Bangalore,KARNATAKA,560103,India
This post is listed under Software Development.
Disclaimer : Hugeshout works to publish the latest job info only and is in no way responsible for any errors. Users must do their own research before joining any company.