Data Engineer @ VirtusLab
- Hands-on experience with Python (4+ years as a Data Engineer)
- Proven experience with data warehouse solutions (e.g., BigQuery, Snowflake)
- Strong background in data modeling, data catalog concepts, data formats, and the design, implementation, and maintenance of data pipelines/ETL
- Ability to thrive in an Agile environment, collaborating with team members to solve complex problems with transparency
- Proficient with AWS/GCP cloud services, including GCS/S3, EMR/Dataproc, and MWAA/Composer
- Experience working in ecosystems that need improvement, and the drive to implement best practices as a long-term process
- Experience with data migration from data warehouse solutions (e.g., BigQuery, Snowflake) to cost-effective alternatives is an advantage
- Familiarity with an Iceberg lakehouse architecture using Trino is a plus
- Familiarity with Starburst is a plus
- Experience with Infrastructure as Code practices, particularly Terraform is an advantage
Don’t worry if you don’t meet all the requirements. What matters most is your passion and willingness to develop. Moreover, B2B does not have to be the only form of cooperation. Apply and find out!
We are #VLteam – tech enthusiasts constantly striving for growth. The team is our foundation, which is why we care most about a friendly atmosphere, plenty of self-development opportunities, and good working conditions. Trust and autonomy are two essential qualities that drive our performance. We simply believe in the idea of “measuring outcomes, not hours”. Join us & see for yourself!
Project scope
Our client is a NASDAQ-listed company that provides a range of solutions to support Go-To-Market (GTM) strategies. They offer a comprehensive B2B database platform that enables sales and marketing professionals to identify, connect with, and engage qualified prospects effectively.
The core mission of our client is to equip every company with a complete, 360-degree view of their ideal customer, enhancing every phase of their GTM strategy and boosting their success in achieving business targets.
Tech stack
Python, Airflow, BigQuery, Snowflake, AWS, GCP, Atlan, GitHub, Terraform, Jenkins, Tableau, Starburst, Trino, Apache Iceberg, Databricks
Challenges
- Delivering business value by creating robust monitoring systems that drive product improvement
- Auditing business products through pain point analysis and KPI verification to enhance performance
- Developing reliable Apache Airflow pipelines to support metrics-driven solutions
- Building an intuitive Metric Data Catalog accessible to all stakeholders
- Optimizing data warehouse architecture for cost-effective, scalable solutions
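As a purely illustrative sketch (not the client's actual implementation, and all names here are hypothetical), the "Metric Data Catalog" challenge above can be thought of as a registry of metric definitions that stakeholders can look up and search:

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class Metric:
    """One entry in a metric catalog (hypothetical schema)."""
    name: str
    description: str
    owner: str
    sql: str            # query that materializes the metric
    tags: tuple = ()


class MetricCatalog:
    """Minimal in-memory registry; a real catalog (e.g. Atlan)
    would add lineage, access control, and full-text search."""

    def __init__(self):
        self._metrics = {}

    def register(self, metric: Metric) -> None:
        # Reject duplicate names so definitions stay unambiguous.
        if metric.name in self._metrics:
            raise ValueError(f"metric {metric.name!r} already registered")
        self._metrics[metric.name] = metric

    def get(self, name: str) -> Metric:
        return self._metrics[name]

    def search(self, tag: str) -> list:
        return [m for m in self._metrics.values() if tag in m.tags]


catalog = MetricCatalog()
catalog.register(Metric(
    name="weekly_active_accounts",
    description="Distinct accounts active in the trailing 7 days",
    owner="data-eng",
    sql="SELECT COUNT(DISTINCT account_id) FROM events "
        "WHERE ts >= now() - interval '7' day",
    tags=("engagement",),
))
```

The point of such a structure is that every metric has exactly one owned, documented definition, which is what makes it "accessible to all stakeholders" rather than scattered across ad-hoc queries.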
Team
The team is composed of three Data Engineers from VirtusLab and five Data Engineers from the customer side, plus a Manager and a Product Owner.
Requirements: Python, Apache Airflow, Trino, Apache Iceberg, SQL, Apache Spark, AWS/GCP, DevOps, Cloud, Terraform, Starburst, Databricks
Additionally: Building tech community, Flexible hybrid work model, Home office reimbursement, Language lessons, MyBenefit points, Private healthcare, Stretching, Training Package, Virtusity / in-house training, Free coffee, Bike parking, No dress code, Shower, Free snacks, Free beverages, Modern office, Kitchen
Job details
Company: VirtusLab
Location: Remote, Kraków, Kielce, Wrocław
Industry: data
Position: Data Engineer @ VirtusLab
Working hours: full-time, 40 hours per week
Start date: immediate
Salary: not specified
Posted: 7 March 2025