Remote Cloud Data Engineer (PySpark, Kinesis, DMS, EMR, and Lambda) Job Vacancy at Turing (Remote)
Are you looking for a new job or for better opportunities?
We have a new job opening for this role.
Full Details :
Company Name : Turing
Location : Remote
Position : Cloud Data Engineer (PySpark, Kinesis, DMS, EMR, and Lambda)
Job Description : Apply to this job here: https://turing.com/Nrdee7

Only candidates with 8+ years of software development experience are eligible for this role.

Turing.com is looking to hire a Cloud Data Engineer on behalf of a national leader in rewards programs and loyalty solutions. The engineer will be responsible for designing and developing cloud-scale data lakes. The candidate is expected to have significant experience ingesting disparate data sources, managing dataflows, and processing data into a data lakehouse environment. The company helps retail operators build their brands, optimize retail operations, and reward their customers with a suite of marketing solutions. Data engineers who excel at problem-solving, are good at systems thinking, and can work with disparate systems to deliver quality data lakehouses are well suited for this position.

Here's everything you need to know to become a Remote Cloud Data Engineer at Turing: http://turing.com/s/wvGFjW

All geared up to become a Turing remote developer at a leading company? Take our exciting coding challenge: https://turing.com/Nrdee7

Job Responsibilities:
- Design and build production data pipelines, from ingestion through consumption, within a big data architecture
- Work with AWS tools such as Spark (PySpark), Glue, Hudi, Kinesis, DMS, EMR, and Lambda for processing data (see the illustrative PySpark sketch below)
- Develop a data lake on S3 with landing, raw, trusted, and curated zones
- Configure an AWS Redshift / Redshift Spectrum data lakehouse
- Automate processes and infrastructure using the AWS CDK for Python (see the illustrative CDK sketch below)

Job Requirements:
- Bachelor's/Master's degree in Engineering or Computer Science (or equivalent experience)
- At least 8 years of relevant experience in data engineering
- Experience working with Apache Spark RDDs and DataFrames, along with the ability to create Spark jobs for data transformation and aggregation
- Must be able to work with Glue, PySpark, Kinesis, DMS, EMR, and Lambda
- Knowledge of Kubernetes (EKS), AWS, Kinesis, Istio, and Jaeger is essential
- Must have expertise in Python, Redshift/Redshift Spectrum, Athena, and SQL
- Understanding of how to work with file formats such as Hudi, Parquet, Avro, and ORC for large volumes of data
- Experience with NoSQL databases such as Cassandra, DocumentDB, and DynamoDB is a plus
- Should be well versed in data warehousing, data lakes, and data lakehouse concepts
- Deep understanding of Git, Gitflow-based workflows, and CI/CD
- Well versed in modern engineering best practices and design principles
- Must have excellent communication, presentation, analytical, and problem-solving skills

How to Become a Turing Developer:
1. Create your account on Turing.com
2. Fill in your basic information (name, number, location, previous salary, experience, etc.)
3. Solve multiple-choice questions
4. Schedule a technical interview
5. Complete final onboarding

Perks & Benefits:
- Earn salaries higher than local standards
- Work alongside a community of Google, Facebook, and Microsoft engineers
- Experience rapid career growth
- No visa requirements to work with the best US companies
- Better work-life balance

Apply today before the vacancies are filled.

Looking for more exciting job opportunities? Find more Turing US software jobs now: https://turing.com/Y70dD7

Hear from developers themselves. Read Turing.com reviews here: https://turing.com/JE5nbr

Job Type: Full-time
Benefits: Flexible schedule
Schedule: Flexible shift
Education: Bachelor's (Preferred)
Experience: Software development: 3 years (Preferred)
Language: English (Preferred)
Application Deadline: 14/03/2022
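To make the pipeline responsibilities above more concrete, here is a minimal PySpark sketch of a transformation-and-aggregation job that reads raw Parquet from an S3 raw zone and writes partitioned Parquet to a curated zone. The bucket paths, column names, and schema (customer_id, event_timestamp, amount) are hypothetical illustrations; the posting does not describe the actual datasets or jobs.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hypothetical zone paths; the real data lake layout is not described in the posting.
RAW_PATH = "s3://example-lake/raw/transactions/"
CURATED_PATH = "s3://example-lake/curated/daily_spend/"

spark = SparkSession.builder.appName("daily-spend-aggregation").getOrCreate()

# Read raw ingested data (e.g. landed by DMS or a Kinesis consumer) as Parquet.
raw = spark.read.parquet(RAW_PATH)

# Transformation and aggregation: total spend and transaction count per customer per day.
daily_spend = (
    raw
    .withColumn("event_date", F.to_date("event_timestamp"))
    .groupBy("customer_id", "event_date")
    .agg(
        F.sum("amount").alias("total_spend"),
        F.count("*").alias("transaction_count"),
    )
)

# Write to the curated zone, partitioned by date, for consumption by Redshift Spectrum or Athena.
daily_spend.write.mode("overwrite").partitionBy("event_date").parquet(CURATED_PATH)
```

On EMR or Glue, a job like this would typically be submitted via spark-submit or configured as a Glue job, with Hudi used in place of plain Parquet when upserts are required.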
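As a rough illustration of the "automating infrastructure using the AWS CDK for Python" responsibility, the sketch below defines a small CDK v2 stack with an S3 bucket for a raw zone and a placeholder Lambda function. The construct IDs, bucket settings, and inline handler are assumptions made for the example, not the company's actual infrastructure.

```python
from aws_cdk import App, Stack, RemovalPolicy
from aws_cdk import aws_lambda as _lambda
from aws_cdk import aws_s3 as s3
from constructs import Construct


class DataLakeStack(Stack):
    """Hypothetical stack: one raw-zone bucket and one placeholder ingest Lambda."""

    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        # Raw-zone bucket; retained on stack deletion so data is not lost accidentally.
        s3.Bucket(
            self,
            "RawZoneBucket",
            versioned=True,
            removal_policy=RemovalPolicy.RETAIN,
        )

        # Placeholder Lambda that could later react to newly landed objects.
        _lambda.Function(
            self,
            "IngestTrigger",
            runtime=_lambda.Runtime.PYTHON_3_9,
            handler="index.handler",
            code=_lambda.Code.from_inline("def handler(event, context):\n    return event"),
        )


app = App()
DataLakeStack(app, "DataLakeStack")
app.synth()
```

Assuming the aws-cdk-lib and constructs packages are installed, the stack would be synthesized and deployed with the standard `cdk synth` and `cdk deploy` commands.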
This post is listed under Technology.
Disclaimer : Hugeshout works to publish the latest job information only and is not responsible for any errors. Users must do their own research before joining any company.