Offer details

Are you seeking new challenges and a place where you can enjoy a close-knit environment while constantly learning?
Welcome to a team with a clear purpose: "TRANSFORM people's lives by being the most reliable technology ally"!
Get ready and join this adventure!
What Will You Find?
Technical and personal challenges that will keep you in constant growth.
A connected team focused on your physical and mental well-being.
A fresh, collaborative culture of continuous improvement, with learning opportunities and people willing to support you.
KaizenHub, a programme designed to enhance your talents, with feedback, mentoring, and coaching through Sofka U.
It'll be both a challenge and a game!
Programmes like Happy Kaizen and WeSofka that look after your physical and emotional well-being.
What Are We Looking For?
We are seeking a skilled Azure Databricks Engineer with expertise in designing, implementing, and optimising big data solutions using Azure Databricks.
This professional will be responsible for leveraging data to drive insights and innovation, ensuring efficient data processing and integration across cloud environments.
Key Responsibilities:
Design and deploy scalable data pipelines using Azure Databricks (see the illustrative sketch after this list).
Integrate data from a variety of sources, ensuring seamless data flow.
Perform data analysis and transform raw data into business-ready formats.
Optimise data processing workflows for better performance and cost-efficiency.
Ensure high data quality and implement data governance best practices.
Collaborate with cross-functional teams to align data strategies with business objectives.
Troubleshoot and resolve issues related to data ingestion and processing.
Document processes, architecture, and system configurations.
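To give a concrete sense of the pipeline work described above, here is a minimal PySpark sketch of a Databricks-style batch transform. The storage account, paths, and column names are hypothetical and serve only to illustrate the kind of task involved, not an actual deliverable of the role.

```python
# Illustrative sketch only: a minimal PySpark batch transform of the kind an
# Azure Databricks pipeline might run. The storage account, container paths,
# and column names below are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_daily_rollup").getOrCreate()

# Ingest raw order events from a (hypothetical) landing zone in Azure Data Lake Storage.
raw = spark.read.json("abfss://landing@examplelake.dfs.core.windows.net/orders/")

# Transform the raw events into a business-ready daily aggregate.
daily = (
    raw.withColumn("order_date", F.to_date("order_timestamp"))
       .groupBy("order_date", "country")
       .agg(
           F.count("*").alias("order_count"),
           F.sum("amount").alias("revenue"),
       )
)

# Persist the result as a Delta table for downstream consumers (e.g. Synapse Analytics).
(daily.write.format("delta")
      .mode("overwrite")
      .save("abfss://curated@examplelake.dfs.core.windows.net/orders_daily/"))
```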
Technical Requirements:
Proven experience with Azure Databricks, PySpark, and Big Data technologies.
Strong understanding of Azure services such as Data Lake Storage and Synapse Analytics.
Proficiency in programming languages like Python or Scala.
Familiarity with data modelling, ETL processes, and data warehousing concepts.
Experience with CI/CD pipelines for automated deployments.
Knowledge of data security protocols and practices.
Ability to optimise and troubleshoot complex data systems.
Exposure to Machine Learning models and their integration is a plus.
Advanced English level (B2+) is a must!
Apply and Be Part of Our Story!
Conditions
Permanent contract - We aim for long-term relationships and for you to be part of our family for a long time!
Looking for professional growth?
You can design your own career plan in line with your aspirations!


Nominal Salary: To be agreed

Source: Talent_Ppc
