Senior Data Engineer Job Vacancy in Morpheus Human Consulting Pune, Maharashtra – Updated today
Are you looking for a new job or better opportunities? We have a new job opening for you.
Full Details :
Company Name : Morpheus Human Consulting
Location : Pune, Maharashtra
Position : Senior Data Engineer
Job Description :
Reference Code: 448-37
Job Title: Senior Data Engineer – Telecom – Pune
Category: Telecom

Our client is a leading technology services firm operating across Asia Pacific, providing services and solutions in consulting, digital, technology, cybersecurity and more. They believe in the power of technology to make extraordinary things happen and to create lasting impact and value for their people, communities and partners. They bring together people and expertise to harness the best of technology. Their diverse 10,000-strong workforce has delivered a wealth of large-scale, mission-critical, multi-platform projects for governments and enterprises in Singapore and the APAC region.

Specialties: Business Application Services, Communications Engineering, ICT Professional Services, Digital, Infrastructure Services, Cloud, Technology, Products, Platforms, Consulting, Digital Transformation, and Cybersecurity

SECTION A: POSITION SUMMARY
This role is accountable for expanding and optimizing the data and data pipeline architecture under Singtel Data & Analytics within Group IT:
1. Design, create and maintain optimal data pipelines
2. Drive optimization, testing and tooling to improve data quality
3. Review and approve solution designs for data pipelines
4. Ensure that proposed solutions align and conform with the big data architecture guidelines and roadmap
5. Evaluate and renew implemented data pipeline solutions to ensure their relevance and effectiveness in supporting business needs and growth

SECTION B: KEY RESPONSIBILITIES AND RESULTS
1. Design and implement data pipelines on the Hadoop platform
2. Understand business requirements and solution designs to develop and implement solutions that adhere to big data architectural guidelines and address business requirements
3. Fine-tune new and existing data pipelines
4. Schedule and maintain data pipelines
5. Drive optimization, testing and tooling to improve data quality
6. Assemble large, complex data sets that meet functional and non-functional business requirements
7. Identify, design and implement internal process improvements: automating manual processes, optimizing data delivery, etc.
8. Build robust and scalable data infrastructure (both batch and real-time) to support the needs of internal and external users
9. Review and approve high-level and detailed designs to ensure the solution delivers on business needs and aligns with the data & analytics architecture principles and roadmap
10. Understand data security standards and use secure data security tools to apply and adhere to the required data controls for user access on the Hadoop platform
11. Support and contribute to development guidelines and standards for data ingestion

Key Skills:

Work Experience:
- Minimum 6 years of experience in data warehousing / distributed systems such as Hadoop
- Experience with relational SQL and NoSQL databases
- Expert in building and optimizing ‘big data’ pipelines, architectures and data sets
- Excellent experience in Scala or Python
- Experience with ETL and/or data wrangling tools in a big data environment
- Ability to troubleshoot and optimize complex queries on the Spark platform
- Knowledgeable in structured and unstructured data design/modeling, data access and data storage techniques
- Experience with cost estimation based on design and development
- Experience with DevOps tools and environments
- Experience working in a Telco Data Warehouse and/or Data Lake

Technical / Professional Skills (at least 3 of the following):
- Hadoop / Big Data knowledge and experience
- Design & development on the Hadoop platform and its components
- AWS services
- Informatica Big Data Management
- Python / Scala / Java
- Hive / HBase / Impala / Parquet
- Sqoop, Kafka, Flume
- SQL
- Relational Database Management Systems (RDBMS)
- NoSQL databases
- Data warehouse platforms or equivalent
- Airflow
- Jenkins / Bamboo
- GitHub / Bitbucket
- Nexus
- Ansible
- Fortify

Location: Pune
Required Experience: 4-10 yrs
Positions: 2
Job Types: Full-time, Contract
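For candidates unfamiliar with the day-to-day of this kind of role: the responsibilities above centre on extract-transform-load (ETL) pipelines with data-quality checks. Purely as an illustration (this sketch is not part of the posting — the column names, quality rules and in-memory "table" are hypothetical, and in the actual role this would be done at scale with Spark/Hive on Hadoop), a minimal batch ETL step in plain Python might look like this:

```python
import csv
import io

def extract(csv_text):
    """Parse raw CSV text into a list of row dicts (stand-in for an ingest step)."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def transform(rows):
    """Apply simple data-quality rules and normalise types; drop failing rows."""
    clean = []
    for row in rows:
        # Hypothetical rule: subscriber_id must be present.
        if not row.get("subscriber_id"):
            continue
        # Hypothetical rule: usage_mb must parse as a number.
        try:
            row["usage_mb"] = float(row["usage_mb"])
        except (KeyError, ValueError):
            continue
        clean.append(row)
    return clean

def load(rows, sink):
    """Append validated rows to a sink (standing in for a Hive table or warehouse)."""
    sink.extend(rows)
    return len(rows)

if __name__ == "__main__":
    raw = "subscriber_id,usage_mb\nA1,120.5\n,99\nA2,not_a_number\nA3,0.0\n"
    table = []
    loaded = load(transform(extract(raw)), table)
    print(loaded)  # 2 rows pass the quality checks; 2 are dropped
```

In production the same extract/transform/load shape would typically be expressed as Spark jobs scheduled by an orchestrator such as Airflow, with the quality rules driving the "optimization, testing and tooling to improve data quality" responsibility listed above.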
This post is listed under Technology.
Disclaimer : Hugeshout works to publish the latest job info only and is in no way responsible for any errors. Users must do their own research before joining any company.
