1. Job Title:
Data Engineer
2. Project Location:
Remote
3. Project Duration:
12 months
4. Candidate Start Date:
June
5. Language / Level (basic, intermediate, advanced, fluent):
English, Advanced/Fluent
6. Travel Availability:
N/A
7. Certifications (list any certifications the candidate must hold):
N/A
8. Required Skills (eliminatory):
Be part of a product team responsible for designing, developing, and maintaining data products on a cloud platform. We are looking for an individual with a strong passion for building cloud solutions that are high-quality, reliable, and secure. Candidates are expected to work collaboratively with other team members and to adhere to agile development/DevOps practices (e.g. Test-Driven Development, Pair Programming), CI/CD (Continuous Integration/Continuous Delivery), and coding standards.
Responsibilities include, but are not limited to:
• Work with Snowflake or Azure, depending on demand
• Design and implement Data Engineering Solutions, including batch and streaming data pipelines on Cloud
• Design and implement the extraction, processing and storing of data from different sources
• Design conceptual, logical and physical data models for analytics solutions
• Collaborate with project teams, business customers and other partners in requirements gathering, solution design and product delivery
• Understand company and industry best practices, technical architecture, design standards, technology roadmaps, and business requirements, and ensure compliance with each of these
• Translate customer needs and processes into logical & physical models, data diagrams and automated workflows
• Explore and learn new technologies according to the technology roadmap and industry trends
• Engage in proof of concepts, technical demos and interaction with customers and other technical teams
Required Technical Skills:
• Work experience on Azure and Snowflake platforms
• Solid knowledge of SQL (queries and stored procedures)
• Solid knowledge of dbt (Data Build Tool)
• Experience building ETL/ELT pipelines in the cloud (e.g. Azure: ADF, Functions, Event Hub, Azure DevOps)
• Good understanding of the Python programming language
• Familiarity with data modeling (e.g. Star and Snowflake schemas)
Nice to have:
• Working experience with Data & Analytics Projects and on-premises data pipelines (e.g. Nifi, SSIS)
• Experience with SQL optimization on SQL Server (indexes, query plans, database load strategies)
• Working experience on integration and flow of data coming from different sources (e.g. SAP, SQL Server, Hadoop)
• CI/CD fundamentals (e.g. DevOps, Git)
• Exposure with visualization analytics tools (e.g. Power BI, Tableau)
• Exposure to big data event streaming solutions (e.g. Kafka)
• Basic understanding about SAP HANA, SAP tables
• Exposure to data replication tools (e.g. Qlik Replicate, SLT)
• Experience with Scrum or a similar Agile framework
Soft Skills:
• Have a good command of English
• Strong analytical and problem-solving skills
• Good collaboration and teamwork
• Self-directed learner and proactive
• Able to work independently with minimal supervision
• Adaptability and flexibility
Work Experience (for experienced hires or experienced contractors):
• 2+ years of hands-on experience designing, developing, and maintaining quality web applications on a cloud platform using Java with Spring Boot and the Spring Framework
Education: Computer Science, Computer Engineering, or equivalent