Rihab B.
Data Engineer
Rihab is a data engineer with over 7 years of experience in regulated industries such as retail, energy, and fintech. She has strong technical expertise in Python and AWS, along with additional skills in Scala, data services, and cloud solutions.
Beyond her technical skills, Rihab has extensive experience in leadership and project management. One of her key achievements is building a data curation service while serving as Scrum Master, where she successfully managed a team and implemented a new data service using Scala.
This combination of solid technical skills and leadership experience makes Rihab an ideal candidate for projects in regulated sectors.
Main expertise
- AWS S3: 5 years
- ETL: 5 years
- MLOps: 2 years
Other skills
- Tableau: 2 years
- Machine Learning: 2 years
- Snowflake: 1 year
Selected experience
Employment
Senior Data Engineer
Data4Geeks - 1 year 10 months
Design & Implementation of a Forecasting Platform - Engie (French Global Energy Company)
- Designed and implemented a comprehensive forecasting platform tailored to the global energy sector;
- Developed data pipelines using Python and PySpark, ensuring efficient and scalable data processing;
- Orchestrated job workflows using Airflow and Databricks, optimizing task management and execution;
- Implemented data engineering processes utilizing Databricks' Delta Live Tables (DLT) for robust data management;
- Built and deployed data stream processing pipelines using DLTs, enabling real-time data processing capabilities (see the sketch below);
- Developed Feature Store APIs for interaction with components and created reusable templates to standardize processes;
- Utilized MLflow to build, manage, and track experiments and machine learning models, ensuring rigorous experimentation;
- Managed the lifecycle of ML models using MLOps techniques, implementing reusable templates for consistency and efficiency;
- Created dashboards for data analysis and visualization, facilitating data-driven decision-making;
- Developed APIs using .NET/C# to expose data, ensuring seamless integration and accessibility across systems;
- Employed tools such as Databricks, PySpark, Python, R, SQL, Glue, Athena, Kubernetes, and Airflow to deliver a robust and scalable solution.
Technologies:
- Machine Learning
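A minimal sketch of what one of the Delta Live Tables streaming steps could look like, assuming a Databricks DLT pipeline (where dlt and spark are provided by the runtime); table names, paths, and the quality rule are hypothetical:

import dlt
from pyspark.sql import functions as F

@dlt.table(comment="Raw energy readings ingested as a stream (hypothetical source).")
def raw_readings():
    return (
        spark.readStream.format("cloudFiles")        # Databricks Auto Loader
        .option("cloudFiles.format", "json")
        .load("/mnt/landing/readings/")              # hypothetical landing path
    )

@dlt.table(comment="Cleaned readings used downstream as forecasting features.")
@dlt.expect_or_drop("valid_value", "consumption_kwh IS NOT NULL")
def curated_readings():
    return (
        dlt.read_stream("raw_readings")
        .withColumn("reading_date", F.to_date("reading_ts"))
        .select("site_id", "reading_date", "consumption_kwh")
    )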
Software Engineering Manager / Senior Data Engineer
Cognira - 6 months
Building and supporting promotion planning demo solution
- Developed generic data pipelines to transform raw client data into a format compatible with the data model of the promotion planning demo system;
- Wrote scripts to generate meaningful business data, ensuring alignment with the needs of the application;
- Collaborated with the science team to understand business requirements and determine the necessary data transformations to enhance data utility;
- Designed and implemented a generic PySpark codebase that efficiently transforms data to fit the required data model (see the sketch below);
- Utilized tools such as PySpark, JupyterHub, Kubernetes, and Azure Data Lake to execute and support the project.
Technologies:
- Azure Blob Storage
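As an illustration of the generic PySpark transformation work, a minimal sketch; the column mapping, paths, and target data model are hypothetical:

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("promo-demo-curation").getOrCreate()

# Hypothetical mapping from raw client columns to the demo data model.
COLUMN_MAP = {"sku": "product_id", "store": "location_id", "sales_qty": "units"}

def to_demo_model(raw_df, column_map=COLUMN_MAP):
    """Rename raw client columns and normalize types for the demo data model."""
    df = raw_df
    for src, dst in column_map.items():
        df = df.withColumnRenamed(src, dst)
    return df.withColumn("units", F.col("units").cast("double"))

raw = spark.read.parquet("/data/raw/client_sales/")          # hypothetical input path
to_demo_model(raw).write.mode("overwrite").parquet("/data/demo/sales/")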
AI/Data Engineer
Data4Geeks - 1 year 11 months
Supporting Data Pipelines, Migrations, and Research on LLM Technologies Integration - Anant (R&D USA-Based Company)
- Led projects focused on integrating Large Language Models (LLM) and AI technologies, driving innovation within the organization;
- Assisted in designing and implementing data migration solutions, ensuring seamless transitions for various clients;
- Developed integrations and clients for vector databases, leveraging different open-source AI tools to enhance capabilities (see the sketch below);
- Actively communicated with clients to gather requirements and ensure alignment with their specific needs;
- Utilized tools such as Python, Google Cloud Platform (GCP), and Datastax to deliver robust solutions.
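To illustrate the core idea behind a vector-database client (store embeddings, answer similarity queries), a self-contained conceptual sketch; it is a stand-in, not the Datastax API, and a real embedding model is assumed to produce the vectors:

import numpy as np

class TinyVectorStore:
    """Conceptual stand-in for a vector store: keeps embeddings in memory
    and answers nearest-neighbour queries by cosine similarity."""

    def __init__(self):
        self.ids, self.vectors = [], []

    def upsert(self, doc_id, vector):
        self.ids.append(doc_id)
        self.vectors.append(np.asarray(vector, dtype=float))

    def search(self, query, k=3):
        query = np.asarray(query, dtype=float)
        matrix = np.vstack(self.vectors)
        sims = matrix @ query / (np.linalg.norm(matrix, axis=1) * np.linalg.norm(query))
        top = np.argsort(-sims)[:k]
        return [(self.ids[i], float(sims[i])) for i in top]

store = TinyVectorStore()
store.upsert("doc-1", [0.1, 0.9])
store.upsert("doc-2", [0.8, 0.2])
print(store.search([0.9, 0.1], k=1))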
Senior Data Engineer
Data4Geeks - 2 years 9 months
Implementing and Migrating Data Pipelines, and Supporting Legacy Systems - SumUp (Fintech German Company)
- Designed and implemented data pipelines for both batch and stream processing, optimizing data flow and efficiency;
- Explored and implemented data pipelines using AWS Glue and PySpark, ensuring scalability and robustness;
- Integrated Delta Lake into the pipelines to enable delta processing, enhancing data management capabilities;
- Developed job templating using Jinja to streamline the creation and management of data processing jobs (see the sketch below);
- Built and automated data validation pipelines, ensuring the accuracy and reliability of processed data;
- Deployed and configured Trino to facilitate efficient data access and querying across various sources;
- Prepared comprehensive documentation for each component and tool explored, ensuring knowledge transfer and easy maintenance;
- Utilized tools such as Python, PySpark, Glue (Jobs, Crawlers, Catalogs), Athena, AWS, MWAA (Airflow), Kubernetes, Trino, and Jinja to achieve project goals.
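A minimal sketch of the Jinja-based job templating approach; the template fields, bucket, dataset, and schedule are hypothetical placeholders:

from jinja2 import Template

# Hypothetical job definition rendered from a parameterized template.
JOB_TEMPLATE = Template(
    """
    name: {{ job_name }}
    source_path: s3://{{ bucket }}/{{ dataset }}/
    schedule: "{{ cron }}"
    """
)

def render_job(job_name, bucket, dataset, cron="0 3 * * *"):
    return JOB_TEMPLATE.render(job_name=job_name, bucket=bucket, dataset=dataset, cron=cron)

print(render_job("transactions_daily", "example-data-lake", "transactions"))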
Software Engineering Manager / Senior Data Engineer
Cognira - 3 years
Building a Data Curation Platform
- Implemented a platform designed to make building data pipelines generic, easy, scalable, and quick to assemble for any new client;
- Prepared detailed design documents, architectural blueprints, and specifications for the platform;
- Gathered and documented requirements, creating specific epics and tasks, and efficiently distributed work among team members;
- Developed command-line and pipeline functionalities that enable chaining transformations, facilitating the creation of generic data pipelines (see the sketch below);
- Supported the management of metadata for various entities defined within the platform;
- Conducted runtime analysis and optimized the performance of different platform functionalities;
- Studied scalability requirements and designed performance improvement strategies to enhance the platform's robustness;
- Built a PySpark interface to facilitate seamless integration with data science workflows.
Technologies:
- Azure Blob Storage
- Scala
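To illustrate the chaining-of-transformations idea, a hypothetical Python analogue (the platform itself was implemented in Scala):

from functools import reduce

def pipeline(*steps):
    """Compose transformation steps into a single callable applied left to right."""
    return lambda data: reduce(lambda acc, step: step(acc), steps, data)

# Example steps operating on a list of records (dicts); names are illustrative.
drop_nulls = lambda rows: [r for r in rows if all(v is not None for v in r.values())]
tag_source = lambda rows: [{**r, "source": "client_a"} for r in rows]

curate = pipeline(drop_nulls, tag_source)
print(curate([{"id": 1, "qty": 5}, {"id": 2, "qty": None}]))

Keeping each step a plain, composable function is what makes such pipelines generic and quick to assemble for a new client.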
R&D Engineer
Cognira - 1 year 8 months
Project 1: Building a Speech Recognition Solution
- Developed a speech recognition solution aimed at transforming retailers' questions and commands into actionable tasks executed against a user interface (UI);
- Utilized TensorFlow, Python, AWS, and Node.js to design and implement the solution, ensuring seamless interaction between the speech recognition engine and the UI (see the sketch below).
Project 2: Design and Implementation of a Short Life Cycle Forecasting System
- Prepared comprehensive design documents and conducted studies on existing AI solutions, with a focus on voice and speech recognition capabilities;
- Collaborated with the team to prepare and collect relevant data for the project;
- Executed the processes of data augmentation, validation, and transformation to extract essential information for forecasting purposes;
- Contributed to building a user interface and integrated backend functionalities using tools such as TensorFlow, Python, AWS, JavaScript, Node.js, Scala, and Spark.
Technologies:
- Machine Learning
- Azure Blob Storage
- Scala
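As a rough illustration of the speech-recognition side, a compact keyword-spotting model over log-mel spectrograms might be defined along these lines; the input shape and the set of command classes are hypothetical:

import tensorflow as tf

NUM_COMMANDS = 8  # hypothetical number of retailer voice commands

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(98, 40, 1)),            # (time frames, mel bins, channel)
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(NUM_COMMANDS, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])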
Software Engineering Manager / Senior Data Engineer
Cognira - 4 years 11 months
Implementing Data Pipelines to support a Promotion Planning solution - Retailer based in Texas (USA)
- Led the team in building data pipelines to support a retailer's promotion planning solution;
- Participated in meetings with business and data science teams to understand and identify project needs;
- Collaborated with the team to translate business requirements into actionable epics and stories;
- Designed and implemented the identified business requirements, ensuring alignment with project goals;
- Developed and executed unit tests to ensure the functional correctness of implementations;
- Created a data loader application using Scala Spark to load data from Parquet files into Cosmos DB/Cassandra API (see the sketch below);
- Implemented an online forecaster API using Scala, Akka, and Docker to enable real-time promotion forecasting;
- Managed the deployment of the project on the client's Kubernetes cluster, ensuring smooth operation and integration;
- Utilized tools such as Scala, Spark, Azure Databricks, Azure Data Lake, and Kubernetes to achieve project objectives.
Technologies:
- Azure Blob Storage
- Scala
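A PySpark analogue of the Parquet-to-Cassandra loader (the original was written in Scala Spark against the Cosmos DB Cassandra API); host, keyspace, table, and paths are hypothetical, and the Spark Cassandra Connector package is assumed to be on the classpath:

from pyspark.sql import SparkSession

spark = (
    SparkSession.builder.appName("promo-data-loader")
    .config("spark.cassandra.connection.host", "example-cosmos-host")  # hypothetical host
    .getOrCreate()
)

(
    spark.read.parquet("/data/promotions/forecasts/")      # hypothetical input path
    .write.format("org.apache.spark.sql.cassandra")        # Spark Cassandra Connector
    .options(keyspace="promotions", table="forecasts")     # hypothetical keyspace/table
    .mode("append")
    .save()
)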
Fullstack Data Scientist
Infor - 3 years 1 month
- Designed and structured the architecture for various components of a retail forecasting project;
- Implemented and deployed key components, ensuring seamless functionality within the overall system;
- Integrated all components, automating the processes and establishing an end-to-end batch process for streamlined operations;
- Optimized the runtime and performance of each component, enhancing the system's overall efficiency;
- Developed forecast comparison templates to facilitate the evaluation of forecast quality, aiding in accurate performance assessments (see the sketch below);
- Utilized Logicblox, Python, and Tableau Software to achieve project goals, ensuring high-quality results.
Technologies:
- Tableau
- Data Science
- Machine Learning
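A small sketch of what a forecast-quality comparison can look like, here using weighted absolute percentage error (WAPE); the data and model names are hypothetical:

import pandas as pd

def wape(actual, forecast):
    """Weighted absolute percentage error: lower is better."""
    return (actual - forecast).abs().sum() / actual.abs().sum()

df = pd.DataFrame({
    "actual":  [120, 95, 143],
    "model_a": [118, 100, 150],
    "model_b": [130, 90, 140],
})
print({m: round(wape(df["actual"], df[m]), 3) for m in ["model_a", "model_b"]})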
Education
Bachelor of Science, Computer Science
National School Of Computer Science · 2011 - 2014
Find your next developer within days, not months
In a short 25-minute call, we want to:
- Understand your development needs
- Explain how we will match you with the best-qualified, carefully selected developer for your project
- Outline our next steps to find you the right developer, often in less than a week