Data Engineer
Remote - Kraków, Małopolskie, Poland
Job requirements
What we expect in general:
- SQL - expert
- Snowflake or any data warehouse solution - expert
- Data modelling - expert
- Python or any modern programming language - advanced
- AWS/GCP - intermediate
- dbt - intermediate
- Terraform - nice to have
- English - advanced
- Hands-on experience with Python or an equivalent programming language
- Strong engineering skills
- Relevant Bachelor's degree – preferably in CS, Engineering, or Information Systems – or an equivalent software engineering background
- Strong SQL skills
- 6+ years of experience as a Data/BI engineer
- Experience with data warehouse solutions (e.g. BigQuery, Redshift, Snowflake)
- Experience with data modelling, data catalogue concepts, data formats, and data pipeline/ETL design, implementation, and maintenance
- Ability to work in an agile environment, partnering with team members and peers to find solutions to challenging problems with transparency
- Experience with AWS/GCP cloud services such as S3/GCS, Lambda/Cloud Functions, EMR/Dataproc, Glue/Dataflow, Athena
- Experience with Airflow and dbt – Advantage
- Experience with data visualization tools and infrastructures (e.g. Tableau, Sisense, Looker) – Advantage
- Experience with development practices – Agile, CI/CD, TDD – Advantage
- Experience with Infrastructure as Code practices – Terraform – Advantage
We do not expect you to meet every point above. A good understanding of some of these areas and a willingness to develop expertise in the others may be sufficient.