Hadoop Data Platform Architect Job Vacancy at Accenture, Bengaluru, Karnataka
Are you looking for a new job or for better opportunities?
We have a new job opening for you.
Full Details :
Company Name : Accenture
Location : Bengaluru, Karnataka
Position : Data Platform Architect
Job Description : About Accenture: Accenture is a global professional services company with leading capabilities in digital, cloud and security. Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Interactive, Technology and Operations services, all powered by the world’s largest network of Advanced Technology and Intelligent Operations centers. Our 674,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. We embrace the power of change to create value and shared success for our clients, people, shareholders, partners and communities. Visit us at www.accenture.com
Project Role : Data Platform Architect
Project Role Description : Architects the data platform blueprint and implements the design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Management Level : 9
Work Experience : 6-8 years
Work Location : Bengaluru
Must Have Skills : Hadoop
Good To Have Skills : Big Data Analysis Tools and Techniques, GCP – Cloud Dataflow, NoSQL
Job Requirements :
Key Responsibilities : Experience with architecting, implementing and/or maintaining technical solutions in virtualized environments; experience in the design, architecture and implementation of data warehouses, data pipelines and data flows; hands-on experience with Google BigQuery, Google Cloud Storage, Dataproc and PySpark.
Technical Experience : Data extraction from relational sources and data transformation using dataframes in PySpark; loading of processed data into BigQuery from GCS using GCP Dataproc (a minimal PySpark sketch of this workflow follows the requirements below); experience developing software in one or more languages such as Java and Python; proficiency in SQL; experience with relational databases, NoSQL databases and Big Data technologies.
Professional Attributes : Excellent analytical and problem-solving skills; ability to understand business needs; excellent verbal and written communication skills; ability to work in a team.
Educational Qualification : B.Tech
Additional Information : Google Professional Data Engineer or AWS Big Data Specialty Certification
15 years of full-time education
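
For illustration only, here is a minimal PySpark sketch of the extract-transform-load workflow described under Technical Experience: reading a relational table over JDBC, transforming it with dataframes, and loading the result into BigQuery from a Dataproc cluster via a GCS staging bucket. All connection details, table names and bucket names below are hypothetical placeholders, and the BigQuery write assumes the spark-bigquery connector that Dataproc clusters typically ship with.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("relational-to-bigquery").getOrCreate()

# Extract: read a relational table over JDBC (placeholder connection details)
orders = (spark.read.format("jdbc")
          .option("url", "jdbc:postgresql://db-host:5432/sales")
          .option("dbtable", "public.orders")
          .option("user", "etl_user")
          .option("password", "****")
          .load())

# Transform: derive a date column and aggregate using dataframe operations
daily_totals = (orders
                .withColumn("order_date", F.to_date("created_at"))
                .groupBy("order_date")
                .agg(F.sum("amount").alias("total_amount")))

# Load: write to BigQuery via the spark-bigquery connector,
# staging intermediate files in a GCS bucket
(daily_totals.write.format("bigquery")
 .option("table", "my_project.analytics.daily_totals")
 .option("temporaryGcsBucket", "my-staging-bucket")
 .mode("overwrite")
 .save())
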
This post is listed under Technology.
Disclaimer : Hugeshout works to publish the latest job information only and is in no way responsible for any errors. Users must do their own research before joining any company.