Data Engineer Job Vacancy at Workday, Pune, Maharashtra – Updated today

Are you looking for a new job or better opportunities?
We have a new job opening for you.

Full Details:
Company Name: Workday
Location: Pune, Maharashtra
Position: Data Engineer

Job Description: Do what you love. Love what you do.

At Workday, we help the world’s largest organizations adapt to what’s next by bringing finance, HR, and planning into a single enterprise cloud. We work hard, and we’re serious about what we do. But we like to have fun, too. We put people first, celebrate diversity, drive innovation, and do good in the communities where we live and work.
About the Team
The Enterprise Data Services organization in Business Technology takes pride in enabling data-driven business outcomes to spearhead Workday’s growth through trusted data excellence, innovation, and architecture thought leadership. The team is responsible for developing and supporting Data Services, Data Warehouse, Analytics, MDM, Data Quality, and Advanced Analytics/ML for multiple business functions, including Sales, Marketing, Services, Support, and Customer Experience. We leverage leading modern cloud platforms such as AWS, Reltio, Tableau, SnapLogic, and MongoDB, in addition to AWS-hosted technologies like Spark, Airflow, Redshift, SageMaker, and Kafka.
About the Role
Develop and automate high-performance data processing systems to drive Workday business growth and improve the product experience.
Evangelize high-quality software engineering practices for building data infrastructure and pipelines at scale.
Build reliable, efficient, testable, and maintainable data pipelines.
Design and develop data pipelines using metadata-driven ETL tools and open-source data processing frameworks.
Apply hands-on experience with source version control, continuous integration, and release/change management delivery tools.
Provide production support and resolve high-priority incidents and development/coding issues.
Work with cross-functional teams to enable data insights across the data lifecycle.
About You
Basic Qualifications:
6+ years of experience designing and building scalable, robust data pipelines that enable data-driven decisions for the business.
Experience with very large-scale data warehouse and data engineering projects.
Experience developing low-latency data processing solutions using technologies such as AWS Kinesis, Kafka, and Spark Streaming.
Proficiency in writing advanced SQL, with expertise in SQL performance tuning.
Experience working with AWS data technologies such as S3, EMR, Lambda, DynamoDB, and Redshift.
Strong experience in one or more programming languages for processing large data sets, such as Python or Scala.
Ability to create data models and star schemas for data consumption.
Extensive experience troubleshooting data issues, analyzing end-to-end data pipelines, and working with users to resolve issues.
BS/MS in Computer Science or equivalent is required.
Preferred Qualifications:
Prior experience with CRM systems such as SFDC is desired.
Experience building analytical solutions for Sales and Marketing teams.
Are you being referred to one of our roles? If so, ask your connection at Workday about our Employee Referral process!

This post is listed under Technology.
Disclaimer: Hugeshout works to publish the latest job information only and is in no way responsible for any errors. Users must do their own research before joining any company.
