NOTE: Please do not call or send your CV through WhatsApp or email ([email protected]).

Senior AI / Machine Learning Engineer (GCP & MLOps)
Department: Data, AI and Intelligent Systems
Location: Jeddah / Riyadh, Saudi Arabia (Hybrid)
About Bupa Arabia:
Bupa Arabia is the leading health insurance provider in Saudi Arabia and the Middle East.
As a cornerstone of the Kingdom’s healthcare sector, we are driving a massive digital
transformation aligned with Vision 2030. At our Data & AI Center of Excellence, our purpose
is to embed enterprise-grade Artificial Intelligence into the core of our business. We
leverage the region’s richest healthcare datasets to autonomously detect Fraud, Waste,
and Abuse (FWA), predict clinical risks, automate complex medical document processing
(OCR/NLP), and deploy Generative AI to elevate the member experience.
The Opportunity:
We are seeking a highly skilled and motivated Senior AI/ML Engineer with deep, hands-on
expertise in building custom AI models deployed on-premises and/or on Google Cloud
Platform (GCP), bridging the gap between Data Science research and Enterprise IT
production.
In this role, you will be the architectural backbone of our AI practice. You will work across
diverse AI fields—from traditional predictive analytics to cutting-edge Large Language
Models (LLMs) and computer vision engines—deploying them into robust, highly available,
and secure microservices on-premises and on GCP. Whether building a real-time REST API to
intercept medical claims in milliseconds or orchestrating massive batch-scoring pipelines,
your work will directly optimize billions of Riyals in healthcare operations.
Key Responsibilities:
• End-to-End ML Model Development & MLOps Pipelines: Design, develop, and
implement production-ready CI/CD pipelines on GCP, encompassing data
ingestion, feature engineering, model training, evaluation, and scalable
deployment.
• GCP AI Architecture: Leverage and orchestrate the full GCP data stack to build Bupa
Arabia’s AI infrastructure:
• Data & Features: Build robust data pipelines and Feature Stores using BigQuery and
Dataflow / Apache Beam.
• Model Training & Registry: Train and version control models using Vertex AI
Workbench and the Vertex Model Registry.


Confidential
• Deployment & Serving: Deploy low-latency real-time inference using Vertex AI
Endpoints and containerize lightweight deterministic rule engines using Cloud Run
or Google Kubernetes Engine (GKE).
• Orchestration: Schedule complex batch-scoring workflows using Cloud Composer
(Apache Airflow).
• Hybrid Cloud AI Integration: Design architectures that securely bridge
on-premises data centers (Oracle/SQL) with GCP AI services using Cloud
Interconnect, Apigee API gateways, or secure REST endpoints.
• Data Anonymization & Security: Build on-premises data masking and
tokenization pipelines (removing PHI/PII) before sending stateless inference
requests to cloud-based LLMs.
• API & System Integration: Wrap machine learning models in secure, high-performance
RESTful APIs (e.g., FastAPI/Flask) to integrate seamlessly with Bupa’s
core claims processing engines (e.g., Care Connect) and API gateways.
• Model Observability: Implement Vertex AI Model Monitoring to continuously track
data drift, concept drift, and training-serving skew, ensuring models adapt to
changing healthcare billing behaviors.
• Data Security & KSA Compliance: Architect AI solutions that strictly adhere to Saudi
Arabian data sovereignty and healthcare regulations (SAMA, CHI, NDMO, PDPL).
Implement Cloud DLP (Data Loss Prevention) and VPC Service Controls to
dynamically mask and secure Protected Health Information (PHI) and National IDs.
• Cross-Functional Collaboration: Partner closely with Data Scientists, FWA
Investigators, Medical SMEs, and Product Managers to translate clinical rules and
business requirements into scalable technical solutions.
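To give a flavor of the data anonymization responsibility above, here is a minimal sketch of a masking-and-tokenization step. It assumes Saudi national/iqama IDs are 10-digit numbers starting with 1 or 2; the key handling and token format are illustrative, not a prescribed design:

```python
import hashlib
import hmac
import re

# Illustrative only: a production system would fetch this from a secret manager.
SECRET_KEY = b"replace-with-managed-key"

# Assumed pattern: Saudi national/iqama IDs are 10 digits starting with 1 or 2.
NATIONAL_ID = re.compile(r"\b[12]\d{9}\b")

def tokenize(value: str) -> str:
    """Replace a sensitive value with a stable, non-reversible token."""
    digest = hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()
    return f"<ID:{digest[:12]}>"

def mask_text(text: str) -> str:
    """Mask national IDs so outbound (e.g., cloud LLM) requests carry no raw PHI."""
    return NATIONAL_ID.sub(lambda m: tokenize(m.group()), text)
```

Because the token is a keyed hash, the same ID always maps to the same token, so masked records can still be joined downstream without exposing the raw identifier.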
What You'll Bring (Required):
• Bachelor’s or Master’s degree in Computer Science, Software Engineering, Artificial
Intelligence, or a related quantitative field.
• 3–5+ years of professional engineering experience, with a proven track record of
taking ML models out of Jupyter notebooks and deploying them into production
environments.
• Deep, hands-on mastery of Google Cloud Platform (GCP) for ML workloads is
essential.
• Strong proficiency in Python (OOP, modular design, unit testing) and relevant AI/ML
libraries (TensorFlow, PyTorch, scikit-learn, Pandas).
• Experience with backend API development frameworks (FastAPI, Flask) for
high-throughput model serving.
• Strong DevOps fundamentals: Docker containerization, Git version control, CI/CD
tools (Cloud Build, GitHub Actions), and Infrastructure as Code (Terraform).
• Solid understanding of machine learning evaluation metrics (Precision, Recall,
ROC-AUC) and the ability to evaluate algorithmic trade-offs for specific business
problems.
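The evaluation metrics named above can be computed from first principles; a dependency-free sketch follows (scikit-learn provides the same via `precision_score`, `recall_score`, and `roc_auc_score`):

```python
def precision_recall(y_true, y_pred):
    """Precision and recall for binary labels (1 = positive, e.g., a flagged claim)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

def roc_auc(y_true, scores):
    """ROC-AUC as the probability a random positive outscores a random negative."""
    pos = [s for t, s in zip(y_true, scores) if t == 1]
    neg = [s for t, s in zip(y_true, scores) if t == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0 for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

In FWA detection, where true positives are rare, the precision/recall trade-off usually matters far more than raw accuracy.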
Desirable Skills (Bonus Points):
o GCP Certifications: Professional Machine Learning Engineer or Professional Data
Engineer.
o Healthcare/Insurance Domain: Prior experience dealing with medical billing codes
(ICD-10, CPT, DRG), NPHIES interoperability standards, claims adjudication, or
Fraud, Waste, and Abuse (FWA) detection.
o Advanced AI: Experience deploying Large Language Models (LLMs), Agentic
workflows (LangChain, LlamaIndex, CrewAI), or Optical Character Recognition
(Google Document AI) systems in production.
o Localization: Familiarity with Arabic NLP and processing localized text datasets.
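As a small taste of the Arabic NLP point above, a common preprocessing step is orthographic normalization. The sketch below strips diacritics and unifies alef variants; the exact rule set varies by pipeline and is an assumption here:

```python
import re

# Arabic diacritics (tashkeel, U+064B–U+0652) plus tatweel/kashida (U+0640).
DIACRITICS = re.compile(r"[\u064B-\u0652\u0640]")

# Common orthographic unifications applied before tokenization or matching.
CHAR_MAP = str.maketrans({
    "أ": "ا", "إ": "ا", "آ": "ا",  # alef variants -> bare alef
    "ى": "ي",                      # alef maqsura -> ya
})

def normalize_arabic(text: str) -> str:
    """Remove diacritics/tatweel and unify letter variants for matching."""
    return DIACRITICS.sub("", text).translate(CHAR_MAP)
```

Normalizing this way makes lookups against medical code tables and member names robust to the spelling variation typical of localized text datasets.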
Why Bupa Arabia?
o Massive Scale & Impact: Your code will directly optimize healthcare spending,
ensure clinical patient safety, and shape the digital future of health insurance in the
Middle East.
o Innovative Environment: Work with cutting-edge cloud architecture and enterprise-tier
compute resources. We invest heavily in giving our engineers the best tools.
To Apply:
Please submit your resume and a cover letter outlining your relevant GCP MLOps
experience and detailing a machine learning system you successfully deployed.