About the job
We’re looking for a Data Engineer to join our Datalab team. This is a permanent role and can be based in London, Salford or Glasgow.
Please note – we offer hybrid working, which means coming into the office 1–2 days per week.
BBC audiences expect the best content to be available to them in a single place, personalised to their preferences and interests. At the moment this is difficult for us to achieve, because our content and audience data are distributed across systems that are hard to connect. We’re also missing metadata for many of our programmes, which makes them difficult to discover. As a result, we’re currently unable to properly engage the next generation of TV licence fee payers, many of whom already have less affinity with the BBC than the rest of the UK population.
Datalab was formed to address these issues by creating a simpler way to discover content. We are doing this by bringing all of our data together in one place and using machine learning to enrich it. As we do this, we can match our programming with individuals’ interests and context. Our approach is to build a data platform that other BBC teams can extend, and which allows many different products to create consistent, relevant experiences for audiences.
In December 2018 we launched the first completely algorithmically driven (but editorially supervised) product in the BBC. While this is an experimental platform with an audience experience that will continue to evolve, it provides an insight into the broader capability we are building.
Our team objectives are:
- Make it easy for BBC teams to rapidly develop and deploy Machine Learning engines
- Provide great recommendations across multiple BBC products
We are aiming high and have an open brief to define what works best for our audience. We want to stay lean and move quickly to build, test and learn as we go, so your contribution will make a difference from day one. We want everyone to feel responsible for our collective success.
You will help us create a data and machine learning environment that can scale to millions of users, integrating new data sources and ensuring that the code we write is robust and scalable. You have a keen interest in machine learning (though not necessarily previous experience), and you are excited and knowledgeable about a tech stack that includes Google Cloud Platform, Python and Kubernetes, with a commitment to microservices and infrastructure as code.
You’ll engage with engineers working on other BBC apps and services, tapping into the wealth of knowledge and experience of an organisation already serving a vast global audience. Learning is an important part of the role, and you’ll have access to BBC Academy training programmes, along with the opportunity to attend technology conferences and use other resources to progress.
Ideally you will have the following skills / experience:
- Typically an advanced degree in computer science, computer engineering or another technical discipline, or equivalent work experience
- Strong software development skills and an interest in the data space
- Experience with agile or other rapid application development methods, and with object-oriented design, coding and testing patterns
- Knowledge of key data structures and algorithms
- Knowledge of agile methodologies and modern software development processes (CI/CD, TDD, cloud development)