Software Development Engineer, Performance Advertising Core Engine – ADCI, Bengaluru, Karnataka

Full Details:
Company Name: ADCI – Karnataka
Location: Bengaluru, Karnataka
Position: Software Development Engineer, Performance Advertising Core Engine

Job Description:
Bachelor's degree (BS/BE) in Computer Science or a related field
2+ years of experience in software development and full product life cycles
Top-notch coding skills in Java and/or C++, coupled with a strong foundation in object-oriented design
Ability to build high-performance, highly available, and scalable distributed systems
Knowledge of data structures, algorithms, and designing for performance, scalability, and availability
Experience with SQL and data modeling
Strong sense of ownership and drive
Sharp problem-solving skills and the ability to resolve ambiguous requirements
Ability to learn new technologies and systems

Job summary
We innovate for petabyte-scale big data management and analytics every single day! We solve compliance and privacy challenges so that our customers don't have to!

Performance Advertising Core Engine (PACE) develops core features and tools to support advertising programs across the Amazon Advertising organization. The mission of the PACE Analytics and Data Management (ADM) team is to provide data that helps the advertising organization make informed analyses and decisions for our customers and to determine and deploy investments for future growth. We do so via a reliable, scalable, policy-compliant (GDPR, CCPA), and easily discoverable data lake that allows our internal customers, such as data engineers, business analysts, ML engineers, research scientists, economists, and other data experts, to collect what they need through self-service tools.

The ADM team owns Spektr, an internal analytics solution that brings together data for the WW Advertising business in one location and provides tools for users to query, visualize, and run computations on that data. The solution is built using native AWS technologies and open-source stacks. The individual microservices are Lambda-based, with MWAA-based orchestration managing Spark jobs that run on EMR clusters (a minimal illustrative sketch of such a job submission follows the tenets below). All job management aspects of the pipeline are configurable and self-serve for our customers, with UI support. The pipeline currently manages a throughput of several petabytes each week and is expected to grow to 3-5x the current volume in 2022. Our guiding tenets are:
1. Quality over quantity
2. Tribal knowledge is the enemy
3. Good governance drives trust
4. Simplicity improves efficiency
5. Speed wins
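
To make the orchestration concrete, here is a minimal sketch of submitting a Spark job as a step on a running EMR cluster using boto3. In a setup like the one described above, a call of this kind would typically sit inside an MWAA (Airflow) task; the region, cluster ID, bucket, and script path below are hypothetical placeholders, not actual Spektr resources.

    import boto3

    # Minimal sketch: submit a Spark job as a step on an existing EMR cluster.
    # The region, cluster ID, and script location are hypothetical placeholders.
    emr = boto3.client("emr", region_name="us-east-1")

    response = emr.add_job_flow_steps(
        JobFlowId="j-XXXXXXXXXXXXX",  # placeholder EMR cluster ID
        Steps=[
            {
                "Name": "example-spark-aggregation",
                "ActionOnFailure": "CONTINUE",
                "HadoopJarStep": {
                    "Jar": "command-runner.jar",
                    "Args": [
                        "spark-submit",
                        "--deploy-mode", "cluster",
                        "s3://example-bucket/jobs/aggregate.py",  # hypothetical job script
                    ],
                },
            }
        ],
    )

    # The returned step ID can be polled (or watched by an Airflow sensor)
    # until the step completes.
    print(response["StepIds"][0])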

A successful candidate will have demonstrated experience in at least some of the following areas: building distributed, high-scale systems; building and scaling very large data processing platforms using distributed computing technologies such as MapReduce and Spark; transactional systems and data persistence using DynamoDB or an RDBMS; and algorithm development for efficient scheduling, sampling, queuing, classification, etc.

Key job responsibilities

Design and deliver services and solutions that solve important customer problems.
Collaborate with engineers, product managers, and program managers across geographies and teams on deliverables and goals.
Troubleshoot and support your and your team’s products and services. What you create is also what you own.
Experiment with and learn AWS and open source tools and technologies that can help accelerate product innovation.

A day in the life

Interact with data engineers, business analysts, ML engineers, and other data experts across the organization to understand evolving demands for advertising data.

About the team
The mission of the PACE Analytics and Data Management (ADM) team is to provide data that helps the advertising organization make informed analyses and decisions for our customers and to determine and deploy investments for future growth via a set of central and unified products and services.

Our team focuses on simplicity, usability, speed, compliance, cost efficiency and enabling high-velocity decision making so our customers can generate high quality insights faster.

Preferred qualifications:
Advanced degree (MS/ME) in Computer Science or a related discipline, or 4+ years of relevant industry experience
Expertise in big data technologies such as EMR, S3, Spark, and Kinesis
Experience with AWS technologies
Proficiency with at least one of these scripting languages: Python, Shell, Groovy, or Scala
