Our infrastructure stores and processes tens of billions of analytics events from hundreds of millions of devices. The Usage Tracking Team is responsible for developing business-critical tools that enable internal teams and customers alike to make sense of the data.
As a Data Engineer, you will help develop and maintain our data platform. This includes evaluating and deploying data technologies and working directly with stakeholders on specific data use cases. You will be responsible for our company data at every stage, from ingestion to dashboards and reporting.
What You Will Do
- Design, develop, optimize and maintain data pipelines for the extraction, transformation, and loading (ETL) of data from various internal sources into our data infrastructure.
- Implement data quality checks and validation processes to guarantee the accuracy and consistency of data.
- Maintain clear and up-to-date documentation of data processes, pipelines and transformations.
- Share your knowledge with other engineers on the team.
- Educate data stakeholders on how they can derive value from the available data.
- Proactively drive initiatives and projects, prioritising tasks and resources as necessary.
Our Current Tech Stack
- Data Platform: BigQuery and other GCP products
- Looker Studio / Tableau
- Python
- ClickHouse
- Kafka
- GitLab
Who You Are
You are an experienced engineer who is passionate about building pragmatic solutions that empower users across multiple organizations to make data-driven decisions. You excel at building and maintaining data infrastructure and have a good overview of existing technologies and best practices. You work effectively in a team setting and take pride in educating and mentoring others. As a data engineer, you are committed to maximizing the value of data to drive innovation and efficiency within the organization.
Ideally, you:
- Have 5+ years of professional experience as a Data Engineer, Software Engineer or similar role, with a strong background in data processing (Data Warehouse, Data Lake or Lakehouse).
- Have experience with GCP (BigQuery, Dataform); this is essential.
- Are fluent in Python and SQL for data manipulation and querying.
- Have knowledge of state-of-the-art technologies for analysing and visualising large amounts of data.
- Have a thorough understanding of data privacy and security best practices.
- Have good communication skills and a structured approach to work.
Who We Are
Could your code give superpowers? Whether enabling delivery drivers to make quicker deliveries, matching a patient with their medication or allowing retailers to make store operations more efficient, our technology automates workflows and provides actionable insights to help businesses in a variety of industries. This means we have no shortage of technical challenges for engineers like you. Join us, as we continue to expand, grow and innovate, and help take Scandit to the next level.
We are proud to be “Great Place to Work” accredited in the UK, USA, Japan, Finland & Switzerland and “Great Place to Work for Wellbeing” in the UK.
Imagine the What. Build the How.
At Scandit we strive to create an inclusive environment that empowers our employees. We believe that our products and services benefit from our diverse backgrounds and experiences and are proud to be a safe space for all.
All qualified applications will receive consideration for employment without regard to race, colour, nationality, religion, sexual orientation, gender, gender identity, age, physical disability or length of time spent unemployed.