Senior Data Engineer
Job description
Senior Data Engineer – Remote (US) - Up to $140,000
AWS, Azure, Databricks, Python, ETL, ELT, CI/CD, SQL, Spark, Hadoop, Talend, Data lakes, Data warehouse.
We are working closely with a modern software company that is looking to hire a Senior Data Engineer. You will join a cross-functional, fully data-driven team that uses the data platform to help the company grow at a rapid rate.
Requirements
- Experience with end-to-end development of large-scale data processing solutions, taking responsibility for the non-functional needs of ETL/ELT data pipelines; this covers design, coding, testing, defect resolution, and operational readiness, and includes setting the standards for these activities.
- Software development experience with distributed data storage and processing technologies, including Hadoop and Spark, using JVM languages.
- Mentoring experience in data development.
- Collaboration with architects and other stakeholders on detailed technology and development practices, and on development estimates.
- Ability to make effective decisions within fast-moving Agile delivery and to lead on troubleshooting.
- Development experience with Cloudera’s distribution of Apache Hadoop and Python.
- Experience with complex data transformations and data visualization, including ETL tools such as Talend.
- Understanding of text processing including Natural Language Processing and machine learning models.
- Experience with streaming technologies such as Kafka and change-data-capture products.
- Data modeling experience with data storage technologies such as document, graph, and log stores, and other non-relational platforms.
- Experience with AWS, Azure, or Databricks.