Are you ready to take the challenge?
We’re looking for a Senior Data Engineer to join our team in Warsaw or remotely.
Sunscrapers is an elite Python development company that helps clients set up dedicated development teams in Poland, made up of the most talented, experienced and motivated software engineers.
Since 2010, we’ve been working with ambitious US and European scaleups, SMBs and enterprises on delivering digital products and extending in-house development teams.
We’ve earned our reputation by combining a passion for technology, the highest quality standards and our unique ‘Product-Developer Fit’ hiring process.
As a Senior Data Engineer you’ll design and implement a system supporting the investment process for a US-based venture capital firm. You’ll need to integrate data from multiple systems and sources to enable data insights, machine learning and data-driven decision processes. You’ll build integrated data flows, a data warehouse and data pipelines using:
- Technologies: Python, SQL, Pandas, NumPy, Shell scripts, Terraform
- Tools: Azure DevOps, Jupyter Notebook, Snowflake, dbt, Airflow, Docker, Kubernetes
- AWS: EC2, ELB, IAM, Lambda, RDS, Route53, S3
- Best Practices: Continuous Integration, Code Reviews, Scrum / Kanban
The ideal candidate will be well organized, eager to constantly improve and learn, driven and, most of all, a team player!
Your responsibilities will include:
- Designing and implementing system architecture and data flow,
- Integrating third-party systems (via APIs) and external data sources into the data warehouse,
- Implementing integrations between systems to match and enrich data (e.g. CRM, Wiki and Slack),
- Designing and building AWS infrastructure with best DevOps practices in mind,
- Developing the data technology stack, including the data warehouse and ETL pipelines,
- Building data flows for data fetching, aggregation and modeling using batch pipelines,
- Designing datasets and schemas for consistency and easy access,
- Building and extending recommendation systems.
What's important for us?
- At least 5 years of professional experience in a data-related role, software engineering or DevOps
- Undergraduate or graduate degree in Computer Science, Engineering, Mathematics, or similar
- Excellent command of spoken and written English, at least C1
- Expertise in Python, SQL and Linux scripting
- Experience in designing schemas for data warehouses
- Experience in building ETL processes and data pipelines with platforms like Airflow or AWS services (Glue, Lambda, Batch, Data Pipeline)
- Expertise in the AWS stack and Docker, as well as deployment automation
- Great analytical skills and attention to detail: asking questions and proactively searching for answers
- Creative problem-solving skills
- Great customer service and troubleshooting skills
You will score extra points for:
- Experience with infrastructure-as-code tools, like Terraform
- Familiarity with Azure DevOps stack
- Proficiency in statistics and machine learning, as well as Python libraries like Pandas, NumPy, matplotlib, seaborn, scikit-learn, etc.
- Experience in operating within a secure networking environment, like a corporate proxy
What do we offer?
- Working alongside a talented team that’s changing the image of Poland abroad.
- Flexible working hours and remote work possibility.
- Comfortable office in a penthouse in central Warsaw, equipped with all the necessary tools to conquer the universe (MacBook, external screen, ergonomic chairs).
- Fully equipped kitchen with fruit, hot and cold drinks.
- Multisport card and private medical care.
- Culture of good feedback: evaluation meetings, mentoring.
- We value and appreciate our engineers’ eagerness to learn and improve, so we strongly encourage and support their growth!
Sounds like the perfect place for you? Don’t hesitate to click apply and submit your application today!