Data Engineer + API Developer Job Vacancy at Kyndryl, Bengaluru, Karnataka – Updated today

Are you looking for a new job or better opportunities?
We have a new job opening for you:

Full Details :
Company Name : Kyndryl
Location : Bengaluru, Karnataka
Position : Data Engineer + API Developer

Job Description : 490259BR
Why Kyndryl
Our world has never been more alive with opportunities and, at Kyndryl, we’re ready to seize them. We design, build, manage and modernize the mission-critical technology systems that the world depends on every day. Kyndryl is at the heart of progress — dedicated to helping companies and people grow strong. Our people are actively discovering, co-creating, and strengthening. We push ourselves and each other to seek better, to go further, and we carry this energy to our customers. At Kyndryl, we want you to keep growing, and we’ll provide plenty of opportunities to make that happen.

Please be aware that the Kyndryl candidate zone is hosted by IBM for a certain period. If you have applied for an IBM role previously, you will be able to log into the candidate zone using your previous IBM login details. Once in the candidate zone, you will be able to see your previous applications for both IBM and Kyndryl. For further information on privacy, please visit www.kyndryl.com/privacy.
Your Role and Responsibilities
As a Data Engineer/API Developer, you are expected to be functionally knowledgeable in deploying and managing AI/ML models and APIs, with a strong emphasis on API development, API maintenance, model deployment, and governance using cloud-native services as well as 3rd-party DSML platforms. In this role on our Data and AI team, you will provide support for one or more projects, assist in defining the scope and sizing of work, and work on Proof of Concept development. You will support the team in providing data engineering, model deployment, and management solutions based on the business problem, integrating with third-party services and designing and developing complex model pipelines for clients’ business needs. You will collaborate with some of the best talent in the industry to create and implement innovative, high-quality solutions, and participate in Pre-Sales and various pursuits focused on our clients’ business needs.

You will also contribute in a variety of roles spanning thought leadership, mentorship, systems analysis, architecture, design, configuration, testing, debugging, and documentation. You will sharpen your leading-edge solution, consultative, and business skills through the diversity of work across multiple industry domains.

Responsibilities

- Design and build scalable machine learning services and data platforms
- Develop model pipelines (DevOps) for model reusability and version control
- Serve models in production leveraging serving engines such as TensorFlow Serving, TorchServe, and Seldon (a minimal illustrative sketch follows this list)
- Analyze, design, develop, code, and implement programs in one or more programming languages for Web and Rich Internet Applications, Cloud Native, and 3rd-party applications
- Support applications with an understanding of system integration, test planning, scripting, and troubleshooting
- Define specifications, develop and modify programs, prepare test data, and prepare functional specifications
- Utilize benchmarks, metrics, and monitoring to measure and improve models
- Develop integrations with monitoring tools to detect model drift and alert – Prometheus, the Grafana stack, or a cloud-native monitoring stack
- Research, design, implement, and validate cutting-edge deployment methods across hybrid cloud scenarios
- Work with data scientists to implement ML, AI, and NLP techniques for article analysis and attribution
- Support the build of complex AI/ML models and help deploy them either on cloud or 3rd-party DSML platforms
- Containerize the models developed by Data Scientists and deploy them in Kubernetes/container environments
- Develop and maintain documentation of model flows, integrations, pipelines, etc.
- Support the teams in providing technical solutions from a model deployment and architecture perspective, ensure the right direction, and propose resolutions to potential model pipeline and deployment-related problems
- Develop proofs of concept (PoCs) of key technology components for project stakeholders
- Collaborate with other members of the project team (Architects, Data Engineers, Data Scientists) to support delivery of additional project components
- Evaluate and create PoVs around the performance aspects of DSML platforms and tools in the market against customer requirements
- Work within an Agile delivery/DevOps methodology to deliver proofs of concept and production implementations in iterative sprints
- Assist in driving improvements to the Enterprise AI technology stack, with a focus on the digital experience for the user as well as model performance and security, to meet the needs of the business and customers now and in the future
- Support technical investigations and proofs of concept, both individually and as part of a team, including being hands-on with code, to make technical recommendations
- Create documentation for architecture principles, design patterns and examples, technology roadmaps, and future planning
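Purely as an illustration of the serving and API work described above, and not part of the original posting, a minimal model-serving endpoint might look like the following sketch. It assumes a pickled scikit-learn model at a hypothetical model.pkl path and the FastAPI, pydantic, joblib, and uvicorn packages; a real deployment would add authentication, validation, and monitoring.

```python
# Minimal illustrative sketch only: serve a pickled scikit-learn model
# behind a REST endpoint. Paths and names here are hypothetical.
import joblib
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="example-model-api")
model = joblib.load("model.pkl")  # hypothetical model artifact


class Features(BaseModel):
    values: list[float]  # a single flat feature vector


@app.post("/predict")
def predict(features: Features) -> dict:
    # Wrap the single sample in a batch and return the first prediction.
    prediction = model.predict([features.values])[0]
    return {"prediction": float(prediction)}

# Run locally with: uvicorn main:app --port 8000
# Containerizing this app and deploying it to Kubernetes follows the same
# pattern described in the responsibilities above.
```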
Required Technical and Professional Expertise
Required Skills:

Python, machine learning engineering, and API development with 3-5 years of experience, including the following skills:
- Strong DevOps, Data Engineering, and ML background with AWS, GCP, or Azure cloud
- Experience with the design and development of REST API platforms using Apigee/APIM, converting web services from SOAP to REST or vice versa
- Experience with security frameworks (e.g., JWT, OAuth2) (see the sketch after this list)
- Experience with API-layer concerns such as security, custom analytics, throttling, caching, logging, monetization, and request/response modifications using Apigee
- Proficient in SQL and stored procedures, such as in Oracle and MySQL
- Experience with Unix/Linux operating systems
- Experience with Scrum and other Agile processes
- Knowledge of Jira, Git/SVN, Jenkins
- Experience creating REST API documentation using Swagger and YAML or similar tools (desirable)
- Experience with integration frameworks (e.g., Mule, Camel) (desirable)
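As a hedged illustration of the JWT item above, and not taken from the posting itself, token validation in Python might look like this minimal sketch. It assumes the PyJWT package and a shared HS256 secret; real API gateways such as Apigee typically verify tokens with asymmetric keys and gateway policies instead.

```python
# Illustrative sketch only: validate a JWT bearer token with PyJWT.
# The secret and claims below are hypothetical.
import jwt  # PyJWT

SECRET = "replace-me"  # in practice, injected from a secret store


def verify_token(token: str) -> dict:
    # Raises jwt.InvalidTokenError (e.g., bad signature, expired token)
    # when validation fails; returns the decoded claims otherwise.
    return jwt.decode(token, SECRET, algorithms=["HS256"])


if __name__ == "__main__":
    issued = jwt.encode({"sub": "demo-user"}, SECRET, algorithm="HS256")
    print(verify_token(issued))  # {'sub': 'demo-user'}
```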
- Experience with one or more MLOps tools: ModelDB, Kubeflow, Pachyderm, Data Version Control (DVC), etc.
- Experience in distributed computing, data pipelines, and AI/ML
- Experience setting up and optimizing databases for production usage in an ML application context
- Experience in Docker, Kubernetes (OpenShift, EKS, AKS, GKE, vanilla K8s), Jenkins, and any CI/CD tool
- Experience in Spark, Kafka, HDFS, Cassandra
- Strong hands-on knowledge of Python, Apache Spark, Kubernetes, and PySpark (a small PySpark sketch follows this list)
- Hands-on expertise in at least 1 data science project – model training and deployment on hyperscalers (AWS, Azure, GCP)
- Experience in any of the following solutions – AWS SageMaker, Azure ML, or GCP Vertex AI, or 3rd-party solutions like H2O.ai, DataRobot, etc.
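To illustrate the PySpark item above, again only as a sketch and not part of the original requirements, a minimal batch aggregation step might look like the following; the input path and column names are hypothetical and assume a local pyspark installation.

```python
# Illustrative sketch only: a small PySpark batch aggregation.
# Input path and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("example-pipeline").getOrCreate()

events = spark.read.parquet("events.parquet")  # hypothetical input
daily_counts = (
    events
    .withColumn("day", F.to_date("event_time"))  # assumes a timestamp column
    .groupBy("day")
    .agg(F.count("*").alias("event_count"))
)
daily_counts.write.mode("overwrite").parquet("daily_counts.parquet")
spark.stop()
```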
Preferred Technical and Professional Experience
- Python programmer
- DevOps – CI/CD implementations
- Data science skills – model development, training
- API development
- Strong knowledge of web services (WSDL/SOAP, RESTful)
- Strong knowledge of Java/Python frameworks (Spring MVC, Spring Security, etc.)
Required Education
Bachelor’s Degree
Preferred Education
Master’s Degree
Country/Region
India
State / Province
KARNATAKA
City / Township / Village
Bangalore
Being You @ Kyndryl
Kyndryl is committed to creating a diverse environment and is proud to be an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, pregnancy, disability, age, veteran status, or other characteristics. Kyndryl is also committed to compliance with all fair employment practices regarding citizenship and immigration status.
Other things to know
When applying to jobs of your interest, we recommend that you do so for those that match your experience and expertise. Our recruiters advise that you apply to not more than 3 roles in a year for the best candidate experience.

For additional information about location requirements, please discuss with the recruiter following submission of your application.
Primary job category
Software Development & Support
Role ( Job Role )
Software Developer
Employment Type
Full-Time
Contract type
Regular
Position Type
Professional
Travel Required
No Travel
Company
(Y030) Kyndryl Solutions Private Limited
Is this role a commissionable/sales incentive based position?
No

This post is listed under Technology.
Disclaimer : Hugeshout works to publish the latest job info only and is not responsible for any errors. Users must research on their own before joining any company.
