
Hamza AMEUR

Machine Learning & Data Engineer

7 years of experience
3 industries
7 certificates
Employed · Open to opportunities
As a Machine Learning & Data Engineer with 7 years of experience, I design and implement data-driven solutions that help companies leverage the power of their data.

Practical information:

  • Age: 30 years
  • Nationality: French
  • Residency status: B permit holder
  • Years of experience: 7+ years
  • Highest academic degree: Master of Science in Information Technology
  • Within Outshift by Cisco, Cisco's incubation engine for emerging technologies, my role is to combine ML and software engineering skills to identify and build agentic AI use cases relevant to Cisco.
  • Mission 1: design, implement and maintain a multimodal RAG solution for one of Outshift's GenAI products. Skills: Python, PGVector, AWS
  • Mission 2: design and implement a multi-agent system that helps network engineers validate network configuration changes in simulation and emulation environments before deploying to production. Skills: Python (LangGraph, pyATS), Batfish, Cisco Modeling Labs
  • Mission 3: implement key components of an OSS agentic observability stack. Skills: Python, OpenTelemetry, ClickHouse
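To illustrate the retrieval step behind a RAG solution like the one above, here is a minimal sketch: toy 3-dimensional vectors and a pure-NumPy cosine search stand in for a real embedding model and a PGVector-backed index (all names and data here are illustrative, not from the actual product).

```python
import numpy as np

def retrieve(query_vec, doc_vecs, doc_ids, k=2):
    """Return the k document ids most similar to the query (cosine similarity)."""
    q = query_vec / np.linalg.norm(query_vec)
    d = doc_vecs / np.linalg.norm(doc_vecs, axis=1, keepdims=True)
    scores = d @ q                      # cosine similarity of each doc to the query
    top = np.argsort(scores)[::-1][:k]  # indices of the k highest scores
    return [doc_ids[i] for i in top]

# Toy "embeddings" standing in for real model output.
docs = np.array([[1.0, 0.0, 0.0],
                 [0.0, 1.0, 0.0],
                 [0.9, 0.1, 0.0]])
ids = ["doc_a", "doc_b", "doc_c"]
print(retrieve(np.array([1.0, 0.05, 0.0]), docs, ids, k=2))  # doc_a and doc_c rank first
```

In a production setup, the same nearest-neighbor query would be pushed down to the vector store (e.g. a pgvector `ORDER BY embedding <=> query` clause) rather than computed in application code.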
  • Within UBS' Group Compliance, Regulatory and Governance, my role is to build data-driven solutions for non-financial risk management.
  • Mission 1: implement Databricks workflows to monitor KPIs for Compliance Risk AI models. Tools: Python (pandas, PySpark), GitLab, Databricks
  • Mission 2: refactor the AI models' data sourcing using an event-based paradigm. Tools: Python (Airflow), Kafka
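The core idea of event-based data sourcing is to fold a stream of change events into the derived state incrementally, instead of reloading snapshots in batch. A minimal in-memory sketch (a plain list stands in for a Kafka topic, and the event schema here is hypothetical):

```python
from collections import defaultdict

def apply_event(state, event):
    """Fold a single sourcing event into the derived model-input table."""
    key = event["entity_id"]
    if event["type"] == "upsert":
        state[key].update(event["fields"])   # merge new field values
    elif event["type"] == "delete":
        state.pop(key, None)                 # drop the entity entirely
    return state

# In production the stream would come from a Kafka topic; a list stands in here.
events = [
    {"type": "upsert", "entity_id": "acct-1", "fields": {"risk": "low"}},
    {"type": "upsert", "entity_id": "acct-2", "fields": {"risk": "high"}},
    {"type": "upsert", "entity_id": "acct-1", "fields": {"risk": "medium"}},
    {"type": "delete", "entity_id": "acct-2"},
]
state = defaultdict(dict)
for e in events:
    apply_event(state, e)
print(dict(state))  # {'acct-1': {'risk': 'medium'}}
```

Because the fold is deterministic, replaying the topic from the beginning rebuilds the same state, which is what makes this paradigm attractive for auditing model inputs.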
  • Within the Datalab team of the Strategy & Development department, I worked on several Data & ML projects, ranging from proofs of concept (PoC) to building data pipelines and deploying to production. A few examples of missions I worked on:
  • Mission 1: Recommender System. Implementation of a recommender system that suggests companies with a high default risk, based on similarity to companies already reviewed by the risk underwriters. Tools: Python (pandas, sklearn, pySOT), GitLab, optimization, Surrogate Modeling, Gower Distance
  • Mission 2: Complementary insurance. Automation of product profitability reporting, analysis of customer behavior and segmentation, predictive analysis of price sensitivity. Tools: Python (pandas, sklearn), GitLab, Airflow, Docker
  • Mission 3: Balance sheet forecast following the Covid crisis. Modeling the impact of the Covid crisis on company balance sheets in order to adapt the risk underwriting strategy. Tools: Python (pandas, pymongo), GitLab
  • Mission 4: REST API for an external client.
    Design and implementation of a REST API for a customer in the banking sector, allowing them to search for and identify a company and buy default-risk scores and payment-behavior insights. Tools: Python (Flask, pandas, cx-Oracle, SQLAlchemy), Docker, Kubernetes, AWS (API Gateway), Elasticsearch, Oracle, GitLab
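Gower distance, mentioned in Mission 1 above, handles the mixed numeric/categorical features typical of company records: numeric features contribute a range-normalized absolute difference, categorical features a simple match/mismatch, averaged over all features. A minimal sketch for a single pair of records (the feature names and values are illustrative only):

```python
def gower_distance(a, b, ranges, categorical):
    """Gower distance for one pair of mixed-type records.

    a, b        : sequences of feature values
    ranges      : per-feature range for numeric features (ignored if categorical)
    categorical : boolean mask, True where the feature is categorical
    """
    parts = []
    for x, y, r, cat in zip(a, b, ranges, categorical):
        if cat:
            parts.append(0.0 if x == y else 1.0)       # exact-match dissimilarity
        else:
            parts.append(abs(x - y) / r if r else 0.0)  # range-normalized difference
    return sum(parts) / len(parts)

# Two toy companies described by (revenue, sector):
d = gower_distance((10.0, "retail"), (30.0, "retail"),
                   ranges=(100.0, None), categorical=(False, True))
print(d)  # 0.1 -> |10-30|/100 averaged with a 0 for the matching sector
```

Ranking candidate companies by this distance to the already-reviewed ones is the similarity criterion such a recommender can build on.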
  • Within the Machine Learning & Data Lab team, I contributed to the development of the team's AI offering. I carried out internal R&D work and participated in missions with external clients.
  • Mission 1: review of the state of the art of Deep Reinforcement Learning with an application to autonomous driving. Tools: Python, TensorFlow, Keras, PyTorch, OpenCV, TORCS, AirSim
  • Mission 2: implementation of a scanned-contract processing tool for a player in the pharmaceutical industry. Tools: Python, Jupyter Notebook, Tesseract OCR, NLTK, TensorFlow, GloVe
  • Within the "Centre de Compétences d'Informatique Cognitive" (Cognitive Computing Competence Center), I carried out PoCs (proofs of concept) on using NLP to analyze customer verbatims. Tools: Python, NLTK, Gensim, Keras, TensorFlow, pandas
  • I contributed to the development of conversational agents intended for Orange employees. Tools: IBM Watson, Google Dialogflow (formerly api.ai), Python, Keras, TensorFlow, Pygal
  • Within the Signal & Communications department, I carried out research work on Spectral Clustering algorithms which resulted in the publication of a conference paper.
  • My mission was to implement a similarity function (kernel) for these algorithms and evaluate its performance. Tool: MATLAB
  • Link to the paper: https://ieeexplore.ieee.org/document/8450769
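The research above was done in MATLAB; as a language-neutral illustration of the underlying technique, here is a minimal NumPy sketch of spectral bipartitioning with a standard RBF (Gaussian) kernel as the similarity function — a generic textbook formulation, not the specific kernel studied in the paper.

```python
import numpy as np

def spectral_bipartition(X, sigma=1.0):
    """Split points into two clusters via the sign of the Fiedler vector
    of the normalized graph Laplacian built from an RBF (Gaussian) kernel."""
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)   # pairwise squared distances
    W = np.exp(-sq / (2 * sigma ** 2))                    # RBF affinity matrix
    d = W.sum(1)                                          # node degrees
    L = np.eye(len(X)) - W / np.sqrt(np.outer(d, d))      # normalized Laplacian
    vals, vecs = np.linalg.eigh(L)                        # eigenvalues in ascending order
    fiedler = vecs[:, 1]                                  # 2nd-smallest eigenvector
    return (fiedler > 0).astype(int)

# Two well-separated toy blobs: the sign split recovers them as two clusters.
pts = np.array([[0.0, 0.0], [0.1, 0.0], [0.0, 0.1],
                [5.0, 5.0], [5.1, 5.0], [5.0, 5.1]])
labels = spectral_bipartition(pts, sigma=1.0)
```

Swapping `W` for a different similarity function is exactly the degree of freedom the research work exercised: the rest of the pipeline (Laplacian, eigendecomposition, sign split) stays unchanged.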