Data Engineer (Scala)
We're looking for a skilled Data Engineer with hands-on experience in Scala, Apache Spark, and Apache Airflow to build and maintain scalable data pipelines and to support both real-time and batch data processing across our platform.
Key Responsibilities:
- Design, develop, and optimize data pipelines using Scala and Spark (a minimal pipeline sketch follows this list).
- Orchestrate workflows and ETL processes using Airflow.
- Collaborate with data scientists, analysts, and backend engineers to support data needs.
- Ensure data quality, reliability, and performance in distributed systems.
- Monitor, troubleshoot, and improve existing data infrastructure.
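To give a flavor of the day-to-day work, here is a minimal Scala/Spark batch pipeline sketch. Everything specific in it is hypothetical rather than a description of our actual stack: the column names (event_id, user_id, event_ts), the input/output paths, and the 99% quality threshold are illustrative only. In practice, the two path arguments would typically be templated and passed in by an Airflow task.

```scala
import org.apache.spark.sql.{SparkSession, functions => F}

object DailyEventsPipeline {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("daily-events-pipeline")
      .getOrCreate()

    // Hypothetical paths, normally supplied by the orchestrator (e.g. Airflow).
    val inputPath  = args(0)
    val outputPath = args(1)

    // Extract: read raw JSON events.
    val raw = spark.read.json(inputPath)

    // Quality gate: drop records missing required fields, and fail the run
    // loudly if too much of the batch is bad (0.99 is an arbitrary example).
    val clean = raw.filter(F.col("event_id").isNotNull && F.col("user_id").isNotNull)
    val total = raw.count()
    val kept  = clean.count()
    require(total == 0 || kept.toDouble / total >= 0.99,
      s"Quality gate failed: only $kept of $total records passed")

    // Transform: aggregate events per user per day.
    val daily = clean
      .withColumn("event_date", F.to_date(F.col("event_ts")))
      .groupBy("user_id", "event_date")
      .agg(F.count(F.lit(1)).as("event_count"))

    // Load: write partitioned Parquet for downstream consumers.
    daily.write.mode("overwrite").partitionBy("event_date").parquet(outputPath)

    spark.stop()
  }
}
```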
Required Skills:
- Proficiency in Scala and Apache Spark for large-scale data processing.
- Experience with Airflow for workflow orchestration.
- Strong understanding of data modeling, ETL pipelines, and distributed systems (see the typed-model sketch after this list).
- Familiarity with cloud platforms (e.g., AWS, GCP) and CI/CD tools is a plus.
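The sketch below illustrates what we mean by data modeling in Scala: encoding a schema as case classes so Spark's typed Dataset API surfaces schema drift early instead of deep inside a job. The RawOrder and RevenueByCurrency model is a made-up example, not our real domain.

```scala
import org.apache.spark.sql.{Dataset, SparkSession}

// Hypothetical domain model: the schema lives in the type system,
// so reads fail fast when the underlying files drift.
final case class RawOrder(orderId: String, userId: String, amountCents: Long, currency: String)
final case class RevenueByCurrency(currency: String, revenueCents: Long)

object RevenueEtl {
  def aggregate(orders: Dataset[RawOrder]): Dataset[RevenueByCurrency] = {
    import orders.sparkSession.implicits._
    orders
      .groupByKey(_.currency)
      .mapGroups { (currency, rows) =>
        RevenueByCurrency(currency, rows.map(_.amountCents).sum)
      }
  }

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("revenue-etl").getOrCreate()
    import spark.implicits._

    // .as[RawOrder] checks the Parquet schema against the case class.
    val orders = spark.read.parquet(args(0)).as[RawOrder]
    aggregate(orders).write.mode("overwrite").parquet(args(1))

    spark.stop()
  }
}
```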