Data Engineer (AI / ML)
Role type: Permanent
Location: UK or Greece
Preferred start date: ASAP
LIFE AT SATALIA
As an organisation, we push the boundaries of data science, optimisation and artificial intelligence to solve the industry's most complex problems. Satalia, a WPP company, is a community of individuals devoted to working on diverse and challenging projects, allowing you to flex your technical skills whilst working with a tight-knit team of high-performing colleagues.
Led by our founder and WPP Chief AI Officer Daniel Hulme, Satalia’s ambition is to become a decentralised organisation of the future. Today, this involves developing tools and processes to liberate and automate manual repetitive tasks, with a focus on freedom, transparency and trust. At the core of our thinking is an approach to wellbeing and inclusivity. We unpack human behaviour and unpick prejudice to ensure a safe and inviting environment. We offer truly flexible working and allow our employees to find the working practice that makes them most productive. At Satalia, your opinion matters and your achievements are celebrated.
THE ROLE
We are investing heavily in developing next-generation AI tools for multimodal datasets and a wide range of applications. We are building large-scale, enterprise-grade solutions and serving these innovations to our clients and WPP agency partners. As a member of our team, you will work alongside world-class talent in an environment that fosters both innovation and personal growth. You will be at the forefront of AI, leveraging multimodal datasets to build groundbreaking solutions over a multi-year roadmap. Your contributions will directly shape cutting-edge AI products and services that make a tangible impact for FTSE 100 clients.
YOUR RESPONSIBILITIES
- Collaborate closely with data scientists, architects, and other stakeholders to understand and break down business requirements.
- Collaborate on schema design, data contracts, and architecture decisions, ensuring alignment with AI/ML needs.
- Provide data engineering support for AI model development and deployment, ensuring data scientists have access to the data they need in the format they need it.
- Leverage cloud-native tools (GCP/AWS/Azure) for orchestrating data pipelines, AI inference workloads, and scalable data services.
- Develop and maintain APIs for data services and serving model predictions.
- Support the development of LLM-powered features, including prompt engineering, LLM calls, agentic frameworks, vector databases and Retrieval-Augmented Generation (RAG) pipelines.
- Implement and optimise data transformations and ETL/ELT processes using appropriate data engineering tools.
- Work with a variety of databases and data warehousing solutions to store and retrieve data efficiently.
- Implement monitoring, troubleshooting, and maintenance procedures for data pipelines to ensure high data quality and optimise performance.
- Participate in the creation and ongoing maintenance of documentation, including data flow diagrams, architecture diagrams, data dictionaries, data catalogues, and process documentation.
MINIMUM QUALIFICATIONS / SKILLS
- High proficiency in Python and SQL.
- Strong knowledge of data structures, data modelling, and database operations.
- Proven hands-on experience building and deploying data solutions on a major cloud platform (AWS, GCP, or Azure).
- Familiarity with containerisation technologies such as Docker and Kubernetes.
- Demonstrable experience building, implementing, and optimising robust data pipelines for performance, reliability, and cost-effectiveness in a cloud-native environment.
- Experience in supporting data science workloads and working with both structured and unstructured data.
- Experience working with both relational (e.g., PostgreSQL, MySQL) and NoSQL databases.
- Experience with a big data processing framework (e.g., Spark).
PREFERRED QUALIFICATIONS / SKILLS
- API Development: Experience building and deploying scalable and secure API services using a framework like FastAPI, Flask, or similar.
- Experience partnering with data scientists to automate pipelines for model training, evaluation, and inference, contributing to a robust MLOps cycle.
- Experience or familiarity with Retrieval-Augmented Generation (RAG) applications and modern AI/LLM frameworks (e.g., LangChain, Haystack, Google GenAI, etc.).
- Hands-on experience with vector databases (e.g., Pinecone, Weaviate, ChromaDB).
WE OFFER
- Benefits - enhanced pension, life assurance, income protection, private healthcare;
- Remote working - café, bedroom, beach - wherever works;
- Truly flexible working hours - school pick up, volunteering, gym;
- Generous leave - 27 days holiday plus bank holidays and enhanced family leave;
- Annual bonus - when Satalia does well, we all do well;
- Impactful projects - focus on bringing meaningful social and environmental change;
- People oriented culture - wellbeing is a priority, as is being a nice person;
- Transparent and open culture - you will be heard;
- Development - focus on bringing the best out of each other.
Satalia is home to some of the brightest minds in AI, and if you're looking to join a company that not only values autonomy and freedom but also embraces a culture of inclusion and warmth, we'd love to hear from you.
We aim to respond to all applications within 2 weeks. If you have not heard from us within 2 weeks, your application has been unsuccessful.
By applying to Satalia you are expressly giving your consent for the collection and use of your information as described within our Satalia Recruitment Privacy Policy.
Good luck!