Data Engineer Job Vacancy at Adaptavist, London

Are you looking for a new job or for better opportunities?
We have a new job opening for you.

Full Details:
Company Name: Adaptavist
Location: London
Position: Data Engineer

Job Description:
About Adaptavist
Adaptavist is a global software innovator, enabling organisations to digitally transform how they collaborate and get work done. Founded in 2005 and with a growing global reach, our customers cut across every major industry and include more than half of all Fortune 500 companies.
Today, Adaptavist leverages its deep technical understanding of Atlassian and innovative partnership with Slack to help organisations embrace new ways of working to achieve competitive advantage.
Through trusted consultancy, app development, training, hosting, and licensing solutions, Adaptavist has established itself as the go-to partner of choice within the Atlassian Ecosystem.
About the role
The Data Engineer will join our Data & Analytics Engineering team to build and scale reliable data pipelines, analytics, and machine learning infrastructure, as well as help build new data products for the business.
With the help of the existing team of Data and Analytics Engineers, this role will be pivotal to the growth of the organisation, working closely with the rest of the Data department to take Adaptavist to the next level of data maturity. The Data Engineer will have the opportunity to create Airflow-based batch data pipelines, implement new solutions for streaming and real-time data, as well as create infrastructure to support our data organisation as a whole.

They will utilise best engineering practices to ensure scalability, readability and cleanliness of code, including but not limited to unit and integration testing, documentation, and CI/CD frameworks. This role will work closely with our Insights, Infrastructure and Engineering teams.
What you’ll be doing
Creating and maintaining Airflow DAGs in Python to orchestrate data ingestion from source systems into the data warehouse, databases, and data lake (a minimal illustrative sketch follows this list)
Creating and maintaining the AWS infrastructure, using IaC, that orchestrates batch and streaming data ingestion pipelines
Maintaining and troubleshooting the full set of data ingestion pipelines, responding to requests and escalations from internal and external teams and stakeholders while on “rota”
Writing infrastructure as code in Terraform, utilising modules and templates provided by dedicated teams
Writing automated unit and integration tests in Python, utilising provided frameworks, to ensure code maintainability and resilience
Collaborating with the Analytics Engineers and Insights team to enable self-service analytics and maintain high data quality throughout
Preparing and maintaining architecture, infrastructure, and data models for data science applications
Liaising and coordinating with external data providers to determine how to integrate data into the existing environment and data model
Using modern data tools to support products and services by solving data integration challenges
Creating and maintaining detailed technical documentation to support onboarding of new team members and avoid knowledge silos forming
Participating in team ceremonies and collaboration as required, such as retrospectives, brainstorming and pair programming
Communicating with internal and external stakeholders as necessary when working on BAU requests or larger projects
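For illustration only, a minimal Airflow DAG of the kind described in this list might look like the sketch below. The endpoint URL, S3 bucket name, and schedule are hypothetical placeholders rather than Adaptavist's actual configuration.

# Illustrative sketch only: a daily Airflow DAG that pulls data from a
# hypothetical REST endpoint and lands it in a placeholder S3 bucket.
from datetime import datetime

import boto3
import requests
from airflow import DAG
from airflow.operators.python import PythonOperator


def ingest_to_s3(**context):
    """Fetch one day's data from the (hypothetical) REST source and write it to S3."""
    response = requests.get("https://api.example.com/v1/events", timeout=30)
    response.raise_for_status()
    s3 = boto3.client("s3")
    key = f"raw/events/{context['ds']}.json"  # partition raw files by execution date
    s3.put_object(Bucket="example-data-lake", Key=key, Body=response.content)


with DAG(
    dag_id="example_rest_ingestion",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="ingest_to_s3", python_callable=ingest_to_s3)

In practice, downstream tasks would then load the landed files into the warehouse and data lake.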
What we’re looking for
Experience creating and maintaining data ingestion pipelines in Python (for example, ingesting from REST endpoints)
Experience with the AWS stack (e.g. S3, Redshift, Athena, Kinesis, Glue)
Experience with modern development tooling, such as CI/CD tools and Git
Experience with execution frameworks (e.g. Apache Airflow)
Proven experience solving complex problems in a timely manner
Experience writing unit and integration tests (a small test sketch follows this list)
Experience with both batch processing and streaming architectures
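As a small illustration of the kind of testing mentioned above, a pytest-style unit test for a hypothetical pipeline transform might look like this (the function and its schema are invented for the example):

# Illustrative sketch only: the transform and schema are placeholders,
# not part of any real Adaptavist pipeline.
def normalise_record(raw: dict) -> dict:
    """Lower-case keys and drop fields with missing values."""
    return {key.lower(): value for key, value in raw.items() if value is not None}


def test_normalise_record_drops_missing_values():
    raw = {"ID": 1, "Name": "Adaptavist", "Region": None}
    assert normalise_record(raw) == {"id": 1, "name": "Adaptavist"}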
A few nice to haves
Familiarity with the Atlassian suite of tools
Experience with any of the following: Hadoop, Spark, TensorFlow, Kafka
Experience with Amazon Firehose, Kinesis, or other event ingestion frameworks
#LI-Remote
Please check out our website to review our global benefits!

At Adaptavist, we are committed to promoting a diverse and inclusive community, and believe this positively impacts the creation of our innovative products, our delivery of bespoke solutions to our global customers, and our own unique culture. We encourage applications from all qualified candidates, regardless of age, disability, race, sexual orientation, religion or belief, sex, gender identity, pregnancy and maternity, marriage, or civil partnership status. From our family-friendly policies to our flexible work environment, we offer a range of benefits and policies to support staff from all backgrounds. If you have any questions, please do ask us. We look forward to your application!

This post is listed under App Development.
Disclaimer: Hugeshout works to publish the latest job information only and is in no way responsible for any errors. Users must research on their own before joining any company.

