Software Engineering Specialist – Big Data, Hadoop – Amdocs, Pune, Maharashtra
Full Details:
Company Name: Amdocs
Location: Pune, Maharashtra
Position: Software Engineering Specialist – Big Data, Hadoop
Job ID: 142779
Required Travel: Minimal
Managerial: No
Location: India – Pune (Amdocs site)
Who are we?
Amdocs helps those who build the future to make it amazing. With our market-leading portfolio of software products and services, we unlock our customers’ innovative potential, empowering them to provide next-generation communication and media experiences for both individual end users and enterprise customers. Our 28,000 employees around the globe are here to accelerate service providers’ migration to the cloud, enable them to differentiate in the 5G era, and digitalize and automate their operations. Listed on the NASDAQ Global Select Market, Amdocs had revenue of $4.3 billion in fiscal 2021. For more information, visit Amdocs at www.amdocs.com.
In one sentence
Responsible for the design, development, modification, debugging, and/or maintenance of software systems. Works on specific modules, applications, or technologies, and handles complex assignments during the software development process.
What will your job look like?
Design, create, enhance, and support Hadoop data pipelines for different domains using big data technologies.
Own data transformation, data models, schemas, metadata, and workload management.
Perform development and deployment tasks: code, unit test, and deploy.
Perform application analysis and propose technical solutions for application enhancements, such as optimizing existing ETL processes.
Handle and resolve production issues (Tier 2 and weekend support) and ensure SLAs are met.
Create the necessary documentation for all project deliverable phases.
Collaborate with multidisciplinary interfaces: the development team, business analysts, infrastructure, information security, and end users.
All you need is…
4+ years of experience with Hadoop architecture and related big data tools such as HBase, HDFS, Hive, and MapReduce.
Superb communication and collaboration skills
Independent and self-learning attitude
Experience in working with Data Governance, Data Quality, and Data Security teams and standards.
Strong experience with:
Hadoop platform (Ambari etc.)
Kafka message-queuing technology
Apache NiFi stream data integration
RESTful APIs and open systems
Object-oriented/functional scripting using languages such as R, Python, Scala, or similar
Informatica BDM or another big data ETL tool to implement complex data transformations
Strong ability to design, build, and manage data pipelines in Python and related technologies
Hands-on experience with SQL, Unix, and advanced Unix shell scripting
Knowledge of handling XML, JSON, structured, fixed-width, and unstructured files using custom Pig/Hive
Knowledge of any cloud technology (AWS/Azure/GCP)
Good to have skills:
Experience working with data discovery, analytics, and BI software tools such as Tableau and Power BI.
Understanding of / experience with data virtualization tools such as TIBCO DV and Denodo.
Understanding of / experience with data governance tools such as Informatica Data Catalog.
Why you will love this job:
The chance to serve as a specialist in software and technology.
You will take an active role in technical mentoring within the team.
We provide stellar benefits from health to dental to paid time off and parental leave!
Amdocs is an equal opportunity employer. We welcome applicants from all backgrounds and are committed to fostering a diverse and inclusive workforce.
Disclaimer: Hugeshout publishes the latest job information only and is not responsible for any errors. Users must do their own research before joining any company.