Hybrid mode: 3 days/week at our client's office (Bucharest)

8 hours per day (contract duration: 6 months to 1 year)

March 2025

Job Requirements 

  • Bachelor’s degree in Computer Science, Information Technology, or a related field;
  • More than 4 years of experience in data engineering, big data, or cloud-based analytics, with a strong preference for experience in the banking or fintech sector;
  • More than 2 years of hands-on experience with Databricks (Workflows, Jobs, Notebooks, and Clusters), including Delta Lake and Spark Structured Streaming;
  • Expertise in Apache Spark (PySpark/Scala) for large-scale data processing and real-time analytics;
  • Advanced proficiency in SQL and Python/Scala, with experience building data transformations and ETL pipelines;
  • Strong understanding of Delta Lake architecture for data versioning, ACID transactions, and schema evolution (see the Delta Lake sketch after this list);
  • Experience with cloud platforms, particularly Databricks on Azure;
  • Knowledge of Lakehouse architecture and of designing scalable, secure, and cost-efficient data pipelines is a plus;
  • Familiarity with streaming architectures such as Kafka is a plus (see the streaming sketch after this list);
  • Experience in MLOps and model deployment using MLflow in Databricks is a plus;
  • Ability to collaborate with data scientists, analysts, and business teams to understand data needs;
  • Strong problem-solving skills, with the ability to troubleshoot performance bottlenecks and optimize workloads;
  • Excellent communication skills to explain technical concepts to non-technical stakeholders.
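
To give candidates a concrete sense of the Delta Lake skills listed above, here is a minimal PySpark sketch of an ACID append, schema evolution via mergeSchema, and time travel. The table path and column names are illustrative assumptions only, and the builder configuration is needed only outside Databricks (it assumes the open-source delta-spark package is on the classpath; on Databricks a Delta-enabled spark session already exists).

```python
from pyspark.sql import SparkSession

# Only required when running open-source delta-spark locally;
# Databricks clusters provide a Delta-enabled `spark` session.
spark = (
    SparkSession.builder.appName("delta-sketch")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

path = "/tmp/lake/transactions"  # hypothetical table location

# ACID append: the write either fully commits a new table version or fails.
tx = spark.createDataFrame([(1, "EUR", 100.0)], ["tx_id", "currency", "amount"])
tx.write.format("delta").mode("append").save(path)

# Schema evolution: mergeSchema lets a new column join the table schema.
tx2 = spark.createDataFrame([(2, "RON", 50.0, "mobile")],
                            ["tx_id", "currency", "amount", "channel"])
tx2.write.format("delta").mode("append").option("mergeSchema", "true").save(path)

# Data versioning (time travel): read the table as of an earlier version.
v0 = spark.read.format("delta").option("versionAsOf", 0).load(path)
v0.show()
```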
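Similarly, the streaming requirements can be illustrated with a small Structured Streaming sketch that reads a Kafka topic into a Delta table. The broker address, topic name, checkpoint location, and output path are all hypothetical, and the snippet reuses the spark session from the sketch above; the Kafka connector package must be on the classpath (it is bundled on Databricks).

```python
# Read a Kafka topic as an unbounded streaming DataFrame.
raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "transactions")               # hypothetical topic
    .option("startingOffsets", "latest")
    .load()
)

# Kafka delivers key/value as binary; cast them for downstream processing.
events = raw.selectExpr("CAST(key AS STRING) AS tx_id",
                        "CAST(value AS STRING) AS payload",
                        "timestamp")

# Sink to a Delta table; the checkpoint makes the stream restartable
# with exactly-once semantics.
query = (
    events.writeStream.format("delta")
    .option("checkpointLocation", "/tmp/lake/_checkpoints/transactions")
    .outputMode("append")
    .start("/tmp/lake/transactions_stream")
)
# query.awaitTermination()  # block until the stream stops
```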
