Data Engineer

Description

Simpson Booth’s client, a UK-based FinTech company, seeks a skilled Data Engineer for their Central London office on a full-time/staff basis.

As part of the Chief Revenue and Platform Officer function, and reporting to the Head of Data Science and Analytics, the Data Engineer will plan, engineer and maintain clients’ data warehouses. As the company’s client base grows, the team will deliver a data warehouse for each client that requires one. This is an opportunity to be involved in the design, deployment and maintenance of multiple warehouse solutions across multiple clients.

The Data Science and Analytics team is accountable for:

  • Maintaining the company’s UK Data Warehouse and associated data pipelines
  • Producing and maintaining BI reports and dashboards used throughout the organisation
  • Producing and maintaining regulatory reports
  • Providing data engineering support when migrating large new clients onto the platform
  • Validating significant changes to system functionality, particularly those that might impact members’ pension holdings
  • Maintaining client reports that integrate with some of our clients’ systems
  • Helping the organisation become more data literate, enabling teams to self-serve their data more efficiently and effectively.

About you

  • As a data engineer you’ll get a kick out of interacting with data in all its forms. More importantly, you get excited by the prospect of using data to solve problems. If you can use your skills to solve a business problem, that’s a great day’s work for you.
  • You will need strong skills in database technologies as well as Python. You’ll also be able to talk to techies and non-techies alike, and propose solutions to problems that both can understand and implement.
  • The company strives to provide a comfortable, relaxed working environment and the Data Science and Analytics team is no different. We take great pride in our work and strive to hit sometimes challenging deadlines, but we like to make sure that people are enjoying their work as we do it. This is not an easy balance to strike but being open and honest with each other is an important step in the right direction.

What you’ll be responsible for

  • Working with stakeholders to identify new data sources that would add business value
  • Implementing new data pipelines for new or existing clients
  • Maintaining existing data pipelines and handling errors as necessary
  • Monitoring the performance, security and scalability of the warehouses
  • Evaluating the existing data warehouse solution to determine necessary updates and integration requirements, and ensuring final designs meet the clients’ needs
  • Impact-assessing change requests, ensuring that changes don’t break existing workflows

Required Experience

  • Prior experience in Python development, with a focus on data analysis/migration/engineering
  • Prior experience in data engineering, with proven experience in data modelling
  • Experience writing, troubleshooting, and debugging complex SQL queries
  • Experience writing, troubleshooting, and debugging Python code (v3.6+)
  • Ability to write automated Python tests (unit, functional, and integration) to ensure code works as expected
  • Experience extracting data from REST APIs and manipulating/loading it into a database
  • Working knowledge of the Snowflake data warehouse
  • Experience with ETL/ELT processes (preferably with Matillion)
  • Hands-on experience with AWS (Fargate and S3)
  • Financial services experience
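To illustrate the REST-extraction and loading requirement above, here is a minimal sketch of the pattern — not the company’s actual stack. The endpoint, table name and schema are hypothetical, and SQLite stands in for a warehouse such as Snowflake:

```python
import json
import sqlite3
from urllib.request import urlopen


def fetch_records(url):
    """Pull JSON records from a REST endpoint (hypothetical URL)."""
    with urlopen(url) as resp:
        return json.load(resp)


def load_records(conn, records):
    """Upsert API records into a simple staging table and return the row count."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS staging_clients (id INTEGER PRIMARY KEY, name TEXT)"
    )
    # Named placeholders let us load dicts straight from the parsed JSON.
    conn.executemany(
        "INSERT OR REPLACE INTO staging_clients (id, name) VALUES (:id, :name)",
        records,
    )
    conn.commit()
    return conn.execute("SELECT COUNT(*) FROM staging_clients").fetchone()[0]


if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    # In practice these records would come from fetch_records(...).
    sample = [{"id": 1, "name": "Acme"}, {"id": 2, "name": "Globex"}]
    print(load_records(conn, sample))  # 2
```

In a production pipeline the same shape applies, with the raw API payload landed in a staging table first so that transformation and error handling happen inside the warehouse.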

Required skills

  • Experience working with the Tableau visualisation tool
  • Experience designing and implementing data warehouses
  • A working understanding of Docker
  • Experience working with big data and/or MPP (massively parallel processing) databases
  • Experience supporting applications running on AWS
  • Understanding of cloud computing security concepts
  • Experience with continuous delivery and automation pipelines
  • Experience with Google Analytics
  • Experience monitoring with AWS CloudWatch and Datadog