Data Engineer Job Vacancy at TechGinia Global Pvt. Ltd., Gurgaon, Haryana – Updated today
Are you looking for a new job or for better opportunities?
We have a new job opening for you.
Full Details :
Company Name : TechGinia Global Pvt. Ltd.
Location : Gurgaon, Haryana
Position : Data Engineer
Job Description :
BRIEF ABOUT THE JOB
A data engineer’s primary role is to prepare data for analytical or operational uses. The Data Engineer integrates, consolidates, cleanses and structures data for use in analytics applications. They aim to make data easily accessible and to optimize their organization’s big data ecosystem.
REQUIRED SKILLS (TECHNICAL & SOFT SKILLS)
· Strong programming skills, being well versed in object-oriented programming (OOP), data structures, and algorithms
· Should be comfortable executing ETL (Extract, Transform and Load) processes, which include data ingestion, data cleaning and curation into a data warehouse, database, or data platform (a minimal pipeline sketch follows the job description below)
· Should be comfortable with schema design
· Experience in distributed computing environments
· Experience with structured/unstructured data and batch/real-time processing (good to have)
· Comfortable with SQL (mandatory), Python (mandatory) and Scala (good to have) to manipulate and prepare data and conduct analyses as needed
· Reading/writing data to/from various sources (APIs, cloud storage, databases, big data platforms)
· Experience working with a Big Data environment such as Hadoop and its ecosystem
· Data transformations and applying ML models
· Creating web services to allow create, read, update and delete (CRUD) operations (see the Flask sketch after the job description)
· Competent in a project management framework such as Agile
· Excellent communication skills, both written and verbal
· Knowledge of machine learning, statistical modelling and natural language processing would be an added advantage
RESPONSIBILITIES
· Connecting, designing, scheduling, and deploying data warehouse systems
· Developing data pipelines and enabling dashboards for stakeholders
· Developing, constructing, testing and maintaining system architectures
· Creating best practices for data loading and extraction
· Doing quick POCs for any data-centric development task
FRAMEWORK/TOOLS KNOWLEDGE REQUIRED
· Python (pandas, Django/Flask, scikit-learn/sklearn)
· SQL, BigQuery
· Hadoop ecosystem (HDFS, Hive, MapReduce, Pig, Spark, etc.)
· Kafka
· Apache Spark
· Linux
· Airflow
REQUIRED EXPERIENCE : 3 – 5 years
QUALIFICATION : B.Tech/BCA/graduate in a related subject
CTC : As per industry
BENEFITS : 5-day working, work from home temporarily, employee insurance
Job Type : Full-time
Salary : From ₹45,000.00 per month
Benefits : Health insurance
Schedule : Day shift, Monday to Friday
Work Remotely : Temporarily due to COVID-19
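To give a feel for the ETL and pipeline-scheduling skills listed above, here is a minimal sketch of a daily pipeline, assuming Apache Airflow 2.x, pandas and SQLAlchemy: it ingests a raw CSV, cleans it, and loads the curated result into a warehouse table. The file path, table name and connection string are hypothetical placeholders, not details from this posting.

# A minimal sketch of a daily ETL pipeline (assumes Airflow 2.x, pandas, SQLAlchemy).
# The source path, target table and warehouse DSN below are hypothetical placeholders.
from datetime import datetime

import pandas as pd
from sqlalchemy import create_engine
from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_transform_load():
    # Extract: ingest raw records (could equally come from an API, cloud storage or a database).
    df = pd.read_csv("/data/raw/orders.csv")
    # Transform: basic cleaning and curation before loading.
    df = df.drop_duplicates().dropna(subset=["order_id"])
    df["order_date"] = pd.to_datetime(df["order_date"])
    # Load: write the curated data into a warehouse table.
    engine = create_engine("postgresql://user:password@warehouse-host/analytics")
    df.to_sql("orders_curated", engine, if_exists="append", index=False)


with DAG(
    dag_id="daily_orders_etl",
    start_date=datetime(2021, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="etl", python_callable=extract_transform_load)

The same extract/transform/load function could be pointed at other sources (APIs, cloud storage, big data platforms) without changing the scheduling around it.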
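The CRUD web-service requirement could look roughly like the following Flask sketch. The /records endpoints and the in-memory store are invented for illustration only; a real service would sit on top of a database.

# A minimal sketch of a CRUD web service in Flask.
# The "records" dict stands in for a database table; field names are hypothetical.
from flask import Flask, jsonify, request

app = Flask(__name__)
records = {}   # id -> record
next_id = 1


@app.route("/records", methods=["POST"])
def create_record():
    global next_id
    records[next_id] = request.get_json()
    next_id += 1
    return jsonify({"id": next_id - 1}), 201


@app.route("/records/<int:record_id>", methods=["GET"])
def read_record(record_id):
    if record_id not in records:
        return "Not found", 404
    return jsonify(records[record_id])


@app.route("/records/<int:record_id>", methods=["PUT"])
def update_record(record_id):
    if record_id not in records:
        return "Not found", 404
    records[record_id] = request.get_json()
    return jsonify(records[record_id])


@app.route("/records/<int:record_id>", methods=["DELETE"])
def delete_record(record_id):
    records.pop(record_id, None)
    return "", 204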
This post is listed under Technology.
Disclaimer : Hugeshout works to publish the latest job information only and is not responsible for any errors. Users must research on their own before joining any company.