Lead Network Automation & Developers Job Vacancy in Airtel India Gurgaon, Haryana – Updated today
Are you looking for a new job or for better opportunities?
We have a new job opening for you.
Full Details :
Company Name : Airtel India
Location : Gurgaon, Haryana
Position : Lead Network Automation & Developers
Job Description :
This is an inspiring role in the Networks Digitization team, where you will apply your knowledge and experience in scripting, coding, and data science methodologies to develop an in-house, closed-loop, AI-driven self-organizing platform. You will also be responsible for developing analytics-based solutions that deliver telecom network and customer-experience use cases, as well as quantitative and qualitative business insights. You will work in a highly collaborative environment where you communicate and plan tasks and ideas.
Responsibilities and Duties:
Responsible for platform/tool development and maintenance – building an AI/ML-driven, closed-loop automation self-organizing networks (SON) platform
Responsible for handling automation issues & QA testing
Responsible for the design and delivery of an XML-template, API-based end-to-end (e2e) network automation platform, including real-time data pipelining
Responsible for developing and establishing automated regression tests that will support Airtel's self-organizing network implementations
Automate test scripts and assist in validating scenarios. You will investigate issues by reviewing and debugging solutions, provide workarounds, and review changes for operability in order to maintain existing software solutions
Responsible for developing scientific methods, processes, and systems to extract knowledge and insights that drive the future of applied analytics in networks.
Use predictive modeling to enhance network and customer experience, revenue generation, and other business outcomes.
Must be able to communicate effectively and have detailed knowledge of data preparation and cleaning, algorithm selection and design, results analysis, and industrialization.
Work closely with other developers, UX designers, and business and systems analysts.
Qualifications and Skills:
Minimum 6–12 months' experience in DevOps, with coding experience in OO languages (C++, Python, Scala, PySpark, Java, JavaScript) and the ability to write stable, maintainable, and reusable code. Good understanding of the Hadoop framework, big data handling, and analytics.
Experience building and handling real-time data pipelines using Apache Kafka, Spark, and Hadoop.
Experience with SQL/NoSQL databases such as MySQL and MongoDB.
Good understanding of RESTful APIs.
Understanding of statistics, e.g., hypothesis formulation, hypothesis testing, descriptive analysis, and data exploration.
Understanding of machine learning, e.g., linear/logistic regression, discriminant analysis, boosting/bagging, etc.
Ambition to learn and implement current state-of-the-art machine learning frameworks such as Scikit-Learn, TensorFlow, and Spark.
Familiarity with the Linux/OS X command line and version control software (Git).
Programming or scripting experience to enable ETL development.
Academic Qualifications
Bachelor's or equivalent degree in Computer Science, Computer Engineering, or IT
This post is listed under Technology
Disclaimer : Hugeshout works to publish the latest job info only and is in no way responsible for any errors. Users must do their own research before joining any company.
