Data Engineer
• Mid-Level
• Remote
• Data
✨ The Role in One Sentence
We are seeking a Data Engineer to develop and operate our central data infrastructure, ensuring our platform is data-driven, performant, and reliable.
📋 What You'll Likely Do
30%: Develop robust ETL pipelines in Python and orchestrate them with Apache Airflow (see the sketch after this list).
30%: Operate production workloads on Azure, keeping them stable through monitoring and optimization.
20%: Model relational data structures in PostgreSQL for efficient storage and querying.
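To give a rough feel for the day-to-day, here is a minimal sketch of the kind of pipeline this role involves: an Airflow DAG wiring a Python extract step to a PostgreSQL load step. The DAG id, task names, and sample rows are purely illustrative assumptions, not part of our actual codebase.

```python
# Minimal illustrative Airflow DAG: extract a few rows, then hand them to a load step.
# All names (dag_id, task ids, sample data) are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders():
    # Placeholder extraction step: in practice this would call an API or read
    # from Blob Storage; here it just returns a couple of sample rows.
    return [{"order_id": 1, "amount": 42.0}, {"order_id": 2, "amount": 13.5}]


def load_to_postgres(ti):
    # Placeholder load step: pull the extracted rows from XCom and, in a real
    # pipeline, upsert them into a PostgreSQL table (e.g. via a Postgres hook).
    rows = ti.xcom_pull(task_ids="extract_orders")
    print(f"Would load {len(rows)} rows into PostgreSQL")


with DAG(
    dag_id="orders_daily",            # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
    load = PythonOperator(task_id="load_to_postgres", python_callable=load_to_postgres)

    extract >> load                   # run extraction before the load step
```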
🧑‍💻 Profiles Doing This Job
High Priority: Experience implementing ETL/ELT pipelines in Python.
High Priority: Knowledge of Apache Airflow or a similar orchestration tool.
High Priority: Experience with Azure services such as Batch, Queue Storage, Blob Storage, and Virtual Machines (a minimal Blob Storage sketch follows this list).
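For the Azure side, a minimal sketch of pulling a raw file from Blob Storage with the azure-storage-blob SDK. The container name, blob path, and connection-string environment variable are illustrative assumptions, not details of our environment.

```python
# Illustrative only: download a raw file from Azure Blob Storage.
# Container, blob path, and env var name are hypothetical.
import os

from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string(
    os.environ["AZURE_STORAGE_CONNECTION_STRING"]  # assumed env var
)
blob = service.get_blob_client(container="raw-data", blob="orders/2024-01-01.csv")

with open("orders.csv", "wb") as f:
    f.write(blob.download_blob().readall())
```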
📈 How This Role Will Look on Your CV
Developed and operated central data infrastructure (Python, Apache Airflow, Azure, PostgreSQL) in a fast-paced environment.