Senior Data Engineer | Python | Apache Spark | AWS | Team SEARCH (f/m/d)
Drive the future of e-commerce directly on the OTTO Campus in Hamburg and experience OTTO's e-commerce spirit and the innovative strength of our group.
The collaboration of all our teams makes otto.de successful every day and steers 123 national and international subsidiaries within the Otto Group's functional divisions.
Sounds exciting? Then let's get started together.
Drive the future of e-commerce - right here on the OTTO Campus in Hamburg. We invest in new technologies and in the further development of our platform otto.de. We live our passion daily in agile teams. Be part of the future we are building together. Are you ready for a change? Then apply now!
At Team SEARCH we ensure that our 8 million customers get the best-matching search results. To live up to these expectations, we use various data science models to find the best products for each customer's query. Finding those products among more than 1.2 million items is a challenging task, and one for which we need your expertise.
Our Tech Stack:
Python | Apache Spark | AWS | Terraform | Linux
What awaits you:
Design and develop high performance data pipelines
Build automated, scalable and distributed systems in our cloud infrastructure
Be a quality advocate in the team and set a good example by writing clean, readable and maintainable code
Be part of a cross-functional team using agile methodologies such as Kanban, pair programming, and test-driven development
Do you have any questions? We can put you in touch with Joscha Harpeng from Team SEARCH. Would you like to meet your future team and see your workspace in person? You will get the chance at a later stage of our application process.
What you should bring:
Several years of experience developing and operating data pipelines in a complex, decentralized infrastructure
Expert know-how in cloud environments and containerisation (e.g. AWS, Docker)
Profound knowledge in designing data structures (persistence and transformation)
Software development skills (Python, Bash, Git, Docker, CI/CD)
Hands-on experience with "infrastructure as code" (e.g. AWS CloudFormation, Terraform)
A strong desire to pair-program!
Experience in working with large amounts of data with Apache Spark