Data & Analytics Tech – GCP – Manager Job Vacancy in PwC Bengaluru, Karnataka – Updated today
Full Details:
Company Name: PwC
Location: Bengaluru, Karnataka
Position: Data & Analytics Tech – GCP – Manager
Job Description:
Line of Service
Advisory
Industry/Sector
Not Applicable
Specialism
Data and Analytics Technologies
Management Level
Manager
Job Description & Summary
A career in our Advisory Acceleration Centre is the natural extension of PwC’s leading-class global delivery capabilities. We provide premium, cost-effective, high-quality services that support process quality and delivery capability for client engagements.
GCP Data Engineer – Job Description
As part of the Analytics Insights practice, within the Enterprise Strategy & Data Management capability, this position provides a platform to work on strategic and tactical initiatives across various industries. PwC offers its services to clients across the globe and helps them with their enterprise-wide business challenges in the area of Data and Analytics. Specific responsibilities for this role include architecting requirements across existing and newly identified capabilities to ensure appropriate leveraging of existing resources, as well as developing new capabilities that can be used immediately, thereby decreasing cycle times, improving data capabilities, and optimizing operating expenses using GCP services.
QUALIFICATIONS
Minimum undergraduate technical degree (B.E., B.Tech, or equivalent) or a post-graduate degree in Computer Applications
Experience: 12–14 years of experience in data engineering
Essential Skills for a GCP Data Engineer – ETL:
Experience in GCP (Google Cloud Platform). Experience in architecting and designing solutions leveraging Google Cloud products such as BigQuery, Cloud Dataflow, Kubernetes, Cloud Pub/Sub, Cloud Bigtable, and TensorFlow is highly preferred.
Experience in delivering artifacts such as scripts (Python) and Dataflow components
Strong experience with SQL
Experience in building and optimizing large-scale data pipeline systems.
Experience in Big Data analytic frameworks and query tools such as Spark, Hadoop, Hive, Impala, etc.
Experience working with SQL and NoSQL databases such as MongoDB and Cassandra.
Good understanding of Big data design patterns and performance tuning.
Experience with data pipeline and workflow management tools such as Azkaban, Oozie, etc.
Strong data warehousing experience building operational ETL or ELT pipelines composed of several sources, and architecting data models or layers for analytics
Streaming Data Integration experience with Sqoop and Kafka
Experience working on projects using NiFi
Able to benchmark systems, analyze system bottlenecks, and propose solutions to eliminate them.
Experience with migration of workloads from on-prem or other public clouds to Google Cloud Platform.
Knowledge of, or working experience on, 1-2 projects in an alternate cloud environment such as AWS or Azure will be a bonus
Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required:
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills
Optional Skills
Desired Languages (If blank, desired languages not specified)
Travel Requirements
0%
Available for Work Visa Sponsorship?
No
Government Clearance Required?
No
Job Posting End Date