Data Engineer (PySpark & Airflow)

  • Remote, Hybrid
  • Kraków, Małopolskie, Poland
  • IT

Job description

We are #VLteam - tech enthusiasts constantly striving for growth. The team is our foundation, which is why we care most about a friendly atmosphere, plenty of self-development opportunities, and good working conditions. Trust and autonomy are two essential qualities that drive our performance. We simply believe in the idea of “measuring outcomes, not hours”. Join us & see for yourself!

Job requirements

Data Engineer (PySpark & Airflow)

Full job description - Link


Required skills:
Senior: Python

Regular: PySpark

Regular: Airflow

Regular: Docker

Regular: Kubernetes

Regular: XGBoost

Regular: Pandas

Regular: Scikit-learn

Regular: NumPy

Regular: GitHub Actions

Regular: Azure DevOps

Regular: Git @ GitHub

What do we expect in general?

  • Hands-on experience with Python.

  • Proven experience with PySpark.

  • Proven experience with data-manipulation libraries (Pandas, NumPy, and Scikit-learn).

  • Regular-level experience with Apache Airflow.

  • Strong background in ETL/ELT design.

  • Regular-level proficiency in Docker and Kubernetes to containerize and scale simulation platform components.

  • Ability to occasionally visit the Kraków office.

  • Good command of English (B2/C1).

or