Key responsibilities
- Build scalable data pipelines for structured and unstructured data
- Integrate APIs, databases, and streaming systems
- Develop and run ETL/ELT processes (cleansing, normalization, enrichment; see the sketch after this list)
- Model data for OLTP, OLAP, and data lake architectures
- Operate modern data platforms (Snowflake, BigQuery, Delta Lake)
- Orchestrate workflows with Airflow, Dagster, or Prefect (an Airflow sketch follows this list)
- Optimize performance (partitioning, caching, query optimization)
- Ensure security, encryption, and GDPR compliance
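
To make the ETL/ELT bullet concrete, here is a minimal transform sketch in Python with pandas; the column names (email, country, signup_date) and the cohort logic are illustrative assumptions, not part of the role description.

```python
import pandas as pd

def transform(raw: pd.DataFrame) -> pd.DataFrame:
    """Cleanse, normalize, and enrich a raw customer extract."""
    df = raw.copy()
    # Cleansing: drop exact duplicates and rows missing the key field
    df = df.drop_duplicates().dropna(subset=["email"])  # "email" is a hypothetical key
    # Normalization: consistent casing and types
    df["email"] = df["email"].str.strip().str.lower()
    df["country"] = df["country"].str.upper()
    df["signup_date"] = pd.to_datetime(df["signup_date"], errors="coerce")
    # Enrichment: derive a monthly cohort for downstream analytics
    df["signup_month"] = df["signup_date"].dt.to_period("M").astype(str)
    return df
```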
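For the orchestration bullet, a minimal Airflow 2.x DAG sketch; the DAG id, schedule, and task callables are placeholders chosen for illustration.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull from source systems")      # placeholder task body

def transform():
    print("cleanse / normalize / enrich")  # placeholder task body

def load():
    print("write to the warehouse")        # placeholder task body

with DAG(
    dag_id="daily_customer_etl",  # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",            # Airflow 2.4+ argument; older versions use schedule_interval
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # Linear dependency chain: extract -> transform -> load
    t_extract >> t_transform >> t_load
```

The same pipeline could equally be expressed as a Dagster job or a Prefect flow; the point is declarative dependencies and scheduled, observable runs.
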
Requirements
- Experience in data engineering, analytics engineering, or backend engineering
- Strong skills in SQL and Python, plus hands-on experience with Spark, Kafka, or Flink (see the sketch after this list)
- Knowledge of modern data warehouse and lakehouse architectures
- Experience building ETL/ELT processes
- A strong quality mindset and a structured way of working
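
As a rough illustration of the SQL + Python + Spark combination (and of the partitioning mentioned under performance), a PySpark sketch; the S3 paths, table name, and columns are assumptions for the example only.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("events_etl").getOrCreate()

# Read raw events (path and schema are illustrative)
events = spark.read.parquet("s3://example-bucket/raw/events/")
events.createOrReplaceTempView("events")

# Express the transformation logic in SQL
daily = spark.sql("""
    SELECT event_date, user_id, COUNT(*) AS event_count
    FROM events
    GROUP BY event_date, user_id
""")

# Partition output by date so downstream queries can prune partitions
daily.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://example-bucket/curated/daily_events/"
)
```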