Data Engineer

Job Title: Data Engineer
Location: Nottingham, UK
Type: Full Time, Permanent

Company Overview

Quotient Sciences is a drug development and manufacturing accelerator providing integrated programs and tailored services across the entire development pathway. Cutting through silos across a range of drug development capabilities, we save precious time and money in getting drugs to patients.

As a growing and successful business, we employ more than 1,100 talented individuals globally, located at state-of-the-art development, manufacturing and clinical facilities in the UK and USA.

Science, Agility and Culture are the core components that define Quotient Sciences, enabling us to do what we do in the way that we do it. People join Quotient Sciences because we are a respected member of the drug development community, driven by an unswerving belief that ideas need to become solutions, and molecules need to become cures, fast. Because humanity needs solutions, fast.

The Role

The Data and Analytics team is an integral part of the business. We build and deliver the data infrastructure that powers our analytics activity, allowing us to drive value from our data by providing analysis, reporting and insights to teams across the full span of the business.

We’re looking for an exceptional Data Engineer to join the team and contribute to building new data architecture and delivering cutting-edge solutions using advanced tools and technologies in the data and analytics space.

The role requires a strong technical understanding of data platforms, data warehousing and visualisation tools. You’ll need to be adaptable, flexible and resourceful in delivering a challenging roadmap whilst our data landscape is transforming. This is a crucial appointment and an amazing opportunity for an ambitious Data Engineer looking to join a growing area of the business.

Key responsibilities:

  • Develop and maintain scalable cloud-based data-warehousing infrastructure at Quotient as a single source of truth for the organisation’s data assets
  • Lead the design, implementation and monitoring of the cloud data platform
  • Map, learn and understand a broad range of Quotient data resources (online and offline) and help progressively integrate them into the enterprise data warehouse (EDW) over time using repeatable ETL pipelines
  • Develop and maintain data engineering tools for automation, monitoring and alerting of the data pipelines; develop metrics to measure data quality and cadence/value delivery
  • Develop technical solutions that fit into the overarching architecture, technology and data strategy
  • Work alongside our BI Analysts and use a range of analytical techniques to solve complex problems and drive decision making (key topics include pricing, customer analytics, forecasting, operational effectiveness, drug discovery, etc.)
  • Work as part of an Agile team to deliver business-focused requirements on the data platform
  • Advise development teams on best practices for the use of data technologies and coding practices
  • Develop and share knowledge on new technologies across our analytics community
  • Contribute to the evolution of the platform by evaluating new technologies and building prototypes covering topics such as machine learning, streaming or API-driven services
  • Build data expertise and leverage data controls to ensure privacy, security, compliance, data quality, and operations for allocated areas of ownership

The Candidate

Essential technical skills:

  • 6+ years’ experience in hands-on data engineering roles (ideally with direct experience in a low maturity company transforming its data and analytics estate)
  • A deep understanding of data warehousing principles in cloud-based technologies (such as the AWS data stack)
  • Expertise in ETL tools, platforms and data pipeline development
  • Significant data architecture, data modelling, database design and development experience
  • Highly proficient in SQL/T-SQL. Knowledge and practical application of Python
  • Experience in end-to-end data orchestration – knowledge of workflow scheduling tools such as Oozie or Airflow
  • Experience in integrating datasets from disparate online and offline systems, including the Salesforce ecosystem, finance systems, process systems, and static/Excel datasets
  • Data performance management (monitoring query performance, enabling steps to improve performance like archiving, indexing, partitioning, compression etc.)
  • Strong understanding of application security concepts, including securing production applications and delivery pipelines
  • Strong understanding of data governance concepts such as data ownership and data stewardship
  • Experience delivering at-scale data engineering, integration and analytics projects
  • Experience using data reporting platforms, such as Tableau

Non-technical skills:

  • Strong architecture design skills; being able to discuss design and trade-offs of different approaches and technologies
  • Naturally inquisitive, with an eye for detail and the willingness to learn new tools and techniques.
  • A methodical, analytical and logical thinker with the ability to see the bigger picture and get to the underlying business questions and challenges that need answering.
  • Outstanding numerical, technical and problem-solving skills.
  • Ability to build and maintain strong relationships with both technical and non-technical colleagues, translating complex concepts into simple terms where required.
  • A team player with a passion for working in a collaborative environment.
  • Extremely organised, delivery focused and deadline driven. The ability to prioritise workload and manage stakeholders effectively.
  • Applied knowledge of Agile principles and frameworks like Kanban and Scrum. Roadmap delivery using tools like Jira.
  • Good commercial awareness.
  • The ability to thrive in a risk-taking, fast-paced environment and enjoy a hands-on role.
  • Ambitious and proactive.
  • A confident and conscientious communicator.

Desirable skills:

  • Design and implementation of data engineering, ingestion and curation functions on the AWS cloud using AWS-native services or custom programming
  • Experience working with the AWS data stack (AWS Glue, Lambda, S3, Athena, SageMaker, etc.)
  • Experience and knowledge of the pharmaceutical industry and associated data
  • Knowledge and experience using other programming and/or statistical languages (e.g. Python, R, etc.)
  • Release management experience using GitHub or Jupyter notebooks

Company benefits:

In return, you will receive a competitive salary, a generous benefits package, excellent training and development, as well as an exciting career within a fast-paced and dynamic business.

Our Commitment to Diversity & Inclusion

Quotient Sciences are advocates for positive change and conscious inclusion.

We strive to create a diverse Quotient family by remaining committed to the development of our culture, equality, mindset, diversity & inclusion in the workplace.

As a global employer, we recognise the value in having an organisation that is a true reflection and representation of our society today. All applicants welcome.