Job Description
IONITY is a joint venture of the car manufacturers BMW Group, Ford Motor Company, Hyundai Motor Group, Kia, Mercedes-Benz AG and Volkswagen Group with Audi and Porsche, along with BlackRock's Climate Infrastructure Platform as financial investor.
We're looking for a skilled Data Engineer to play a key role in shaping the future of electric vehicle charging! Your role is to support our business by designing, engineering, and developing our data management systems and processes on a modern cloud infrastructure.
As part of our core data team, you'll collaborate with a diverse range of internal and external stakeholders across the EV charging landscape. You'll be a trusted advisor, providing strategic and technical guidance on all aspects of IONITY's data lifecycle.
Responsibilities:
In the data team, you will lead the design and development of our data platform as well as support all data-related activities as a technical expert.
In particular:
- You build, automate, and maintain production pipelines for data centered around EV charging in the cloud.
- You design the structure and schema of our central data lake and data warehouse.
- You work together with Data Scientists, Data Analysts, and Business Stakeholders to implement data-based solutions that help grow our business.
- You champion a data-driven mindset and contribute to developing our data strategy.
Your profile:
- A degree in Computer Science, Mathematics, Statistics, or a related field, or equivalent professional experience as a Data Engineer.
- Strong analytical thinking and problem-solving skills.
- Drive to excel both as part of a team and individually.
- Highly motivated and proactive, with a flexible, can-do attitude.
- Excellent communication and interpersonal skills, enabling you to effectively collaborate with diverse teams.
- Fluency in English.
Our main technical focus is on data pipelines, database systems, and Python:
- Excellent programming skills in Python and its data-related ecosystem.
- Strong database & SQL skills.
- Experience with message queues, batch and stream data processing.
- Knowledge of data modelling, processing, cleaning and schema consolidation.
- Experience in collaborative coding, version control systems and in building and deploying CI/CD pipelines.
- Proficiency in data communication APIs (e.g. REST), protocols (e.g. FTP, HTTP), and formats (e.g. JSON, CSV, Parquet).
- Preferred: Knowledge of data-related AWS services (RDS, S3, Glue, Athena, SageMaker, …) and provisioning them via Infrastructure as Code.
- Preferred: Familiarity with Apache Airflow.