This is an awesome opportunity for a Data Engineer (Scala, Spark, AWS, Python; $120K–$160K) to join a fast-paced, fast-growing organization that is redefining its industry.
You will be responsible for building and improving the data pipeline and reporting infrastructure.
You will manage multiple projects. You might work to improve optimization algorithms, write code using Apache Spark, participate in architectural decisions for new components, or collaborate with DevOps to improve performance. In the same week, you could build user-facing interfaces and reports with front-end developers, write code to import, process, and QC terabytes of new data, and work with analysts and statisticians to ensure the validity of our processes.
Are you the right person?
The skills you will need:
It would be great if you also had:
Please apply now to find out more!