Senior Data Engineer

Posted 16 September 2022
Salary £60,000 - £95,000 per annum, Benefits: plus stock options
Job type Permanent
Contact Name: Adam Stevenson

Job description

Senior Data Engineer - Up to £95,000 - Fully Remote (UK) - No Sponsorship
AWS, Azure, Databricks, Python, ETL, ELT, CI/CD, SQL, Spark, Hadoop, Talend, Data lakes, Data warehouse
Our client, who has just been named in Glassdoor’s top 50 places to work in 2022, is looking for a Senior Data Engineer to join their growing team. Despite being a well-established company, they are currently going through massive growth and are looking to expand their engineering functions.
They specialise in business transformations, usually across the financial, healthcare and public sectors. A large focus of this role will be the end-to-end development of greenfield data environments. This will involve building data warehouses and data lakes in the cloud (either AWS or Azure) to help customers understand the value of their data and foster a data-driven culture.

  • Experience of the end-to-end development of large-scale data processing solutions, taking responsibility for the non-functional needs of ETL/ELT data processing pipelines. This covers design, coding, testing, defect resolution and operational readiness, and includes setting the standards for these activities.
  • Software development experience with distributed data storage and processing technologies, including Hadoop and Spark, using JVM languages.
  • Collaborate with architects and other stakeholders on detailed technology and development practices, and on development estimates.
  • Ability to make effective decisions within fast-moving Agile delivery and to lead on troubleshooting.
  • Development experience with Cloudera’s distribution of Apache Hadoop and with Python.
  • Experience of complex data transformations and data visualisation, including ETL tools such as Talend.
  • Understanding of text processing including Natural Language Processing and machine learning models.
  • Experience with streaming technologies such as Kafka and change-data-capture products.
  • Data modelling experience with data storage technologies such as document, graph and log stores, and other non-relational platforms.
  • Experience with AWS, Azure or Databricks.