Since 1998, we have been active in the Human Resources consulting market, providing regional coverage across four key areas of expertise: recruitment and selection, personnel leasing, assessment centers, and consultancy. As leaders in Transylvania, we have expanded our reach and embraced a culture of continuous improvement, strengthening our position in the Romanian and wider regional market. This commitment underscores our dedication to evolving alongside the dynamic needs of our clients and the ever-changing business environment. Our success stems from the professionalism of our services, the multidisciplinary expertise of our consulting team, and our ongoing collaboration with those who rely on our consultancy. Building long-term partnerships with clients across diverse industries such as IT&C, automotive, outsourcing, pharma, banking, FMCG and more is our primary objective. Our commitment to client orientation, teamwork, flexibility, excellence, dedication and responsibility reflects our aim to add value through our services.

Role description:
• Develop and provide Big Data solutions;
• Develop high-traffic, reliable web applications using Python, cloud platforms, and PySpark;
• Code with performance, scalability, and usability in mind;
• Work with new and emerging technologies, prototypes, and engineering process improvements aligned with leading industry trends;
• Migrate existing pipelines built with ADF and Python scripts from on-premise to the cloud using ADF and Databricks;
• Develop new ADF pipelines for data extraction from multiple sources (MySQL databases, Oracle databases, CSV files) and Databricks pipelines (PySpark & SparkSQL) for curation, validation, and applying the required business transformation logic;
• Work on client projects delivering Microsoft Azure-based data engineering & analytics solutions;
• Engineer and implement scalable analytics solutions on Azure.
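The curation and validation step described above — dropping invalid records and applying a business transformation before loading — can be sketched as follows. This is a minimal illustration in plain Python; an actual Databricks job would express the same logic with PySpark DataFrame operations, and the field names and rules here are hypothetical:

```python
def curate(rows):
    """Validate raw rows and apply a simple business transformation.

    Each row is a dict as it might arrive from a CSV or database extract.
    Validation: rows missing required fields (or with a null amount) are
    dropped. Transformation: amounts are normalized to integer cents.
    Field names and rules are hypothetical, for illustration only.
    """
    required = {"order_id", "amount"}
    curated = []
    for row in rows:
        if not required.issubset(row) or row["amount"] is None:
            continue  # validation: drop incomplete records
        curated.append({**row, "amount_cents": round(float(row["amount"]) * 100)})
    return curated

raw = [
    {"order_id": 1, "amount": "19.99"},
    {"order_id": 2, "amount": None},   # fails validation: null amount
    {"amount": "5.00"},                # fails validation: missing order_id
]
print(curate(raw))  # only the first row survives, with amount_cents added
```

In a Databricks notebook the equivalent would typically be a DataFrame `filter` plus `withColumn` (or a SparkSQL `WHERE` clause), triggered from an ADF pipeline after the extraction step.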
Qualifications:
• Bachelor's degree in computer science or a related field;
• 5+ years of experience in a similar role, leading fellow data engineers within the data engineering area;
• Extensive experience working in agile project environments;
• Working experience with Python or PySpark is required;
• Working experience with and understanding of Azure Databricks, Azure Data Factory, Azure Data Lake, Azure SQL DW, and Azure SQL is required;
• The ability to apply such methods to solve business problems using one or more Azure Data and Analytics services, in combination with building data pipelines, data streams, and system integrations;
• Experience in driving new data engineering developments (e.g. applying cutting-edge data engineering methods to improve the performance of data integration, using new tools to improve).
Databricks Engineer in Sibiu
Contact
Contact details will be visible after you apply!