Data Platform Engineer
Remote - Kraków, Małopolskie, Poland
Job description
As a Data Platform Engineer on our team, you'll play a crucial role in designing, implementing, and deploying data platforms for our customers. We're looking for an experienced software engineer skilled in building large-scale data pipelines and systems, with strong problem-solving abilities and a track record of operating at scale. You'll drive innovation, rapidly test and validate new concepts, and integrate them into the platform in close collaboration with the team.
Job requirements
Data Platform Engineer - Expert
Required skills:
- Experience in Data Platform Engineering and Big Data (Expert)
- Java/Scala/Python (Expert)
- Apache Beam (Expert)
- Apache Spark (Expert)
- Cloud AWS/GCP (Senior)
- Kubernetes (Senior)
- SQL (Senior)
- Kafka/BigQuery/Airflow (Nice to have)
- English (Expert)
What we expect in general:
- 8+ years of Software Engineering experience in data platform/big data software, with a proven track record of delivering highly scalable and efficient solutions
- Substantial experience with Java 8+ (preferred), Scala, or Python
- Experience with streaming and data processing technologies such as Beam, Spark, Kafka, Airflow, HBase, or Presto
- Experience in building enterprise-grade software in a cloud-native environment (GCP or AWS) using cloud services
- Experience in system architecture and design
- Familiarity with designing CI/CD pipelines using Jenkins, GitHub Actions, or similar tools
- Experience with Kubernetes using GKE/EKS
- Experience with SQL, particularly performance optimization (nice to have)
- Experience with Graph and Vector databases or processing frameworks (nice to have)
- Bachelor’s degree in Computer Science, Software Engineering, or related field
- Excellent communication skills and a pragmatic approach to problem-solving
We do not expect you to meet every requirement above. A good understanding of some of these areas and a willingness to develop expertise in others may be sufficient.