Skills:
- Snowflake (advanced)
- DBT (advanced)
- ETL (advanced)
- English (advanced)
- AWS (regular)
- Fivetran (regular)
- Git (nice to have)
- Airflow (nice to have)
- Dagster (nice to have)

At Idego Group, you’ll work with people who find pleasure in programming and have deep knowledge of a variety of technologies. We deliver quality work and give ourselves a lot of autonomy, common sense, and general friendliness. In each and every project you’ll need to speak English fluently and cooperate with quite a senior team of developers (6 to 15 years of experience) who are searching for new colleagues to join them :)

The challenge is to build a data bridge between SAP and Snowflake (AWS is basically a landing zone). Even if that doesn’t sound like rocket science, these pipelines will be massive: 200–300 GB in storage, with daily loads estimated in the gigabyte range. We are currently setting up a PoC of the preferred toolset. We will need to ingest the COPA module of SAP, probably using Fivetran and DBT, into a suitable data structure on Snowflake, then build Data Vault models and Power BI DAX models to support financial reporting.

The project consists of building a suitable data pipeline from the SAP system to the enterprise data platform, which is a BI and data analytics reporting system. The internal customer of the new Data team will be the finance team. Working closely with the accountants is needed to determine what the data model should look like, how it should be interpreted, and how it should be documented; a governance team also helps with documentation. We want to build from scratch: a proper, robust data platform in the cloud that can scale up and down as things change. Sometimes there won’t be a lot of data, but at certain times, such as before school starts or before the holiday season, the volume grows.
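To give a flavour of the Data Vault layer mentioned above, a minimal dbt hub model over a staged COPA table could look like the sketch below. This is only an illustration under assumed naming: the model, table, and column names (stg_sap_copa, copa_document_id, etc.) are hypothetical, not taken from the project.

```sql
-- Hypothetical dbt model: models/vault/hub_copa_document.sql
-- Builds a Data Vault hub from a staged SAP COPA table (names are illustrative).
{{ config(materialized='incremental', unique_key='hub_copa_key') }}

select
    md5(copa_document_id)  as hub_copa_key,      -- surrogate hash key for the hub
    copa_document_id       as copa_document_id,  -- business key from SAP COPA
    current_timestamp()    as load_ts,           -- load timestamp (Snowflake function)
    'fivetran_sap'         as record_source      -- lineage: where the record came from
from {{ ref('stg_sap_copa') }}

{% if is_incremental() %}
-- On incremental runs, only insert business keys not already in the hub.
where copa_document_id not in (select copa_document_id from {{ this }})
{% endif %}
```

Satellites with the COPA measures and a snapshot date would then hang off this hub, and the Power BI / DAX layer would query views built on top of the vault.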
You could say that the final product is a financial profitability report in Power BI with a financial model underneath, from which further reporting can be derived.

Technical specification:
- familiarity with Snowflake, DBT, and Git is necessary
- AWS and Fivetran are optional
- experience with an orchestrator such as Airflow or Dagster is a plus

Recruitment process:
- technical verification with a Data Architect
- meeting with the team

In terms of technology:
- The client has all three major cloud platforms, but this team mainly uses AWS and Azure.
- Azure is there for Power BI reporting, Analysis Services, and an Azure database (for DAX).
- The vast majority of the technology stack is tied to AWS, although few AWS components are used beyond Lambda and S3 buckets.
- DBT will be used for SQL transformations.
- Confluence is used for documentation.
- No decision has yet been made on a specific integration tool; it will be either HVR from Fivetran or SNP Glue from SAP. This role will not be much involved in that choice.

What perks are waiting for you:
- A work environment with zero micromanagement – we cherish autonomy.
- 100% remote work (unless you want to work from our HQ in Gdynia), including recruitment and onboarding.
- 100% paid holidays (24 working days) and paid leave (2 weeks) if you feel sick or dizzy.
- Really cool seaside apartments available for free, for both leisure and work.
- An experienced team with 8 to 15+ years in commercial projects.
- A unique memes channel.
- Private medical insurance (basic dental services included) and Multisport.

We want you to join our team. We are neither an agency giving you projects from time to time, nor a huge corporation where you are “dev XYZ”. At Idego – you matter!
Data Engineer in Bucharest