Data Engineer Job Vacancy at Kaarlo Training & HR Solutions Pvt. Ltd., Chennai, Tamil Nadu

Are you looking for a new job or for better opportunities?
We have a new job opening for:

Full Details :
Company Name :
Kaarlo Training & HR Solutions Pvt. Ltd.
Location : Chennai, Tamil Nadu
Position : Data Engineer

Job Description : Create and maintain optimal data pipeline architecture.

Assemble large, complex data sets that meet functional / non-functional business requirements.

Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.

Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies.

Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.

Work with Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
Requirements
• Data Engineer with 5+ years of work experience designing and implementing Data Engineering projects, including building Data Pipelines and Data Modeling, leveraging AWS / GCP / Azure cloud services
• Strong expertise in building Data Pipelines using Python, PySpark, SQL, the Hadoop ecosystem, and Airflow
• Strong SQL knowledge and experience working with a variety of databases
• Experience in Bash/Shell scripting, Linux, and UNIX
• Experience working with both Relational and NoSQL databases
• Strong analytical skills and excellent written and verbal communication
Benefits
Good career growth

Great work culture
Salary: 10 to 15 LPA


This post is listed under Technology.
Disclaimer : Hugeshout publishes the latest job information only and is not responsible for any errors. Users must do their own research before joining any company.
