Projects
Professional Scale (at an organisation)
Payments Modernisation -- [Large UK-based Financial Services Organisation]
Enhanced digital payments by building a cloud-native platform for secure, fast transactions.
Prioritised resilience and regulatory compliance, ensuring the platform could scale with growing digital payment volumes.
The solution supports robust transaction processing in a secure, standards-compliant environment, improving overall system reliability and performance.
Fraud Data Analytics -- [Large US-based Financial Services Organisation]
Improved connectivity across the data ecosystem to enable secure, low-latency data operations and smooth integration between components of the architecture.
Developed a solution linking Kafka streams to Python-based data ingestion, optimised for real-time processing.
The result was faster, more dependable ingestion: lower latency, reliable data flow, and high data integrity across the pipeline.
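The consume–validate–batch pattern behind that ingestion can be sketched as below. This is a minimal stand-in, not the project's actual code: the in-memory queue and the "fraud-events" payload shape are hypothetical, and a production service would poll a real Kafka consumer (e.g. from confluent-kafka or kafka-python) instead.

```python
import json
from collections import deque

# Hypothetical stand-in for a Kafka topic: in production this would be a
# real consumer being polled; here a deque of JSON strings simulates it.
def fake_topic():
    events = [
        {"txn_id": i, "amount": 100.0 + i, "flagged": i % 3 == 0}
        for i in range(10)
    ]
    return deque(json.dumps(e) for e in events)

def ingest(topic, batch_size=4):
    """Consume messages, validate them, and yield micro-batches."""
    batch = []
    while topic:
        raw = topic.popleft()
        try:
            event = json.loads(raw)          # deserialise the message
        except json.JSONDecodeError:
            continue                         # drop malformed messages
        if "txn_id" not in event:
            continue                         # basic schema validation
        batch.append(event)
        if len(batch) >= batch_size:
            yield batch                      # hand a full batch downstream
            batch = []
    if batch:
        yield batch                          # flush the final partial batch

batches = list(ingest(fake_topic()))
```

Micro-batching like this is one common way to trade a little latency for much higher downstream throughput.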
AIMS, Hermes & CN Shipping Squad -- [Large UK-based Energy Organisation]
Enhanced operations by developing and maintaining three core applications: AIMS, Hermes, and Casualty Notifier.
AIMS includes the Advisor Portal for terminal assessments and vessel inspection tracking, and the Requestor Portal for raising inspection requests.
Hermes serves as the source of truth for data on locations, organisations, vessels, cargo, and deals.
Casualty Notifier ensures prompt communication of casualty incidents during vessel journeys, notifying relevant parties in real time. Together, these applications streamline processes, improve data access, and enhance operational efficiency.
Personal Interest
Comprehensive, Real-World Solutions
A hybrid framework designed to improve operational efficiency in urban bike-sharing services by integrating Deep Reinforcement Learning (RL) with autonomous agentic AI, creating an optimised, self-regulating ecosystem.
Key Features & Methodology:
- Intelligent Redistribution: RL model predicts future demand from real-time bike usage patterns, optimising station-level redistribution decisions.
- Autonomous Logistics: Agentic AI coordinates with operational staff and drivers, dispatching tasks and optimising routes without manual intervention.
- System Integration: Seamlessly connects with existing infrastructure, leveraging real-time data streams for adaptive operations.
Impact & Outcomes:
- Improved bike availability and reduced wait times, boosting customer satisfaction.
- Optimised routes lowered fuel usage and operational costs.
- Demonstrates how predictive ML combined with autonomous execution can address complex urban mobility challenges.
Skills: ※Reinforcement Learning ※Agentic AI ※Machine Learning ※Predictive Modelling ※Combinatorial Optimisation
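The redistribution step above can be sketched as follows. This is a deliberately simplified stand-in: the real system uses an RL policy to predict demand, whereas here predicted demand is given directly and a greedy matcher moves bikes from surplus stations to deficit stations; the station names and figures are hypothetical.

```python
def plan_moves(stock, demand):
    """Return a list of (src, dst, n) transfers covering predicted shortfalls."""
    surplus = {s: stock[s] - demand[s] for s in stock if stock[s] > demand[s]}
    deficit = {s: demand[s] - stock[s] for s in stock if stock[s] < demand[s]}
    moves = []
    # serve the largest shortfalls first, drawing from the largest surpluses
    for dst, need in sorted(deficit.items(), key=lambda kv: -kv[1]):
        for src in sorted(surplus, key=lambda s: -surplus[s]):
            if need == 0:
                break
            n = min(need, surplus[src])
            if n > 0:
                moves.append((src, dst, n))
                surplus[src] -= n
                need -= n
    return moves

stock  = {"A": 10, "B": 2, "C": 6}   # bikes currently docked per station
demand = {"A": 4,  "B": 7, "C": 6}   # predicted near-term demand
moves = plan_moves(stock, demand)
```

In the full framework, an agentic layer would turn each `(src, dst, n)` transfer into a dispatched, route-optimised task for a driver.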
Focused, Exploratory Projects
End-to-end demos of cutting-edge generative AI: image captioning, text-to-image with Stable Diffusion, and more.
Skills: ※Generative AI ※Stable Diffusion ※Image Captioning ※Transformers ※Streamlit
A library of deep-learning architectures (CNNs, RNNs, Transformers) with Jupyter demos comparing performance and resource trade-offs.
Skills: ※TensorFlow ※PyTorch ※Keras ※Jupyter Notebook ※Model Benchmarking
End-to-end NLP pipelines: tokenisation, embeddings, sequence models, and transformer fine-tuning for text classification and generation tasks.
Skills: ※NLTK ※spaCy ※Hugging Face Transformers ※Seq2Seq Modeling ※Text Preprocessing
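The first stage of such a pipeline (tokenise, build a vocabulary, encode text as integer IDs) can be sketched as below. This is a minimal illustration using only the standard library; the actual projects would use NLTK, spaCy, or a Hugging Face tokenizer, and the corpus here is hypothetical.

```python
import re
from collections import Counter

def tokenize(text):
    # crude lowercase word tokeniser; real pipelines use library tokenizers
    return re.findall(r"[a-z']+", text.lower())

def build_vocab(corpus, specials=("<pad>", "<unk>")):
    """Map tokens to integer IDs, most frequent first, after the specials."""
    counts = Counter(tok for doc in corpus for tok in tokenize(doc))
    vocab = {tok: i for i, tok in enumerate(specials)}
    for tok, _ in counts.most_common():
        vocab[tok] = len(vocab)
    return vocab

def encode(text, vocab):
    # unknown tokens fall back to the <unk> ID
    return [vocab.get(tok, vocab["<unk>"]) for tok in tokenize(text)]

corpus = ["the model reads text", "the text is tokenised"]
vocab = build_vocab(corpus)
ids = encode("the unseen text", vocab)
```

These integer sequences are what embedding layers and sequence models consume downstream.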
Developed collaborative, content-based, and hybrid recommenders using custom similarity measures to deliver more personalised and relevant recommendations.
Skills: ※Python ※Surprise ※Pandas ※Flask ※Similarity Metrics
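The content-based half of such a recommender can be sketched as below, assuming items are described by feature vectors and ranked by cosine similarity. The item names and genre weights are hypothetical; the actual project also used collaborative signals and custom similarity measures.

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

# hypothetical item profiles: [action, romance, sci-fi] weights
items = {
    "film_a": [1.0, 0.0, 0.5],
    "film_b": [0.9, 0.1, 0.6],
    "film_c": [0.0, 1.0, 0.1],
}

def recommend(liked, items, k=2):
    # score every other item against the liked item's profile
    scores = {i: cosine(items[liked], v) for i, v in items.items() if i != liked}
    return sorted(scores, key=scores.get, reverse=True)[:k]

recs = recommend("film_a", items)
```

A hybrid system would blend these content scores with collaborative-filtering scores before ranking.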
A hands-on collection of ML workflows—data ingestion, feature engineering, model building, and evaluation—standardised via reusable pipelines.
Skills: ※Scikit-Learn ※Pandas ※MLflow ※Matplotlib ※Feature Engineering
Distributed data-processing pipelines in PySpark for large-scale model training and batch inference.
Skills: ※PySpark ※Hadoop ※Data Partitioning ※RDD & DataFrame APIs ※Batch Inference
Data-modelling patterns and aggregation pipelines in MongoDB, optimised for ML feature stores and real-time lookup.
Skills: ※MongoDB ※Aggregation Framework ※Indexing Strategies ※Schema Design ※Feature Store
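An aggregation pipeline of the kind described above might look like the following. The collection and field names are hypothetical; the pipeline is defined here as plain Python data, and executing it would require pymongo and a live server (e.g. `db.events.aggregate(pipeline)`).

```python
# Roll raw events up into per-user feature rows for fast lookup.
pipeline = [
    # keep only recent events so the lookup stays cheap
    {"$match": {"ts": {"$gte": "2024-01-01"}}},
    # aggregate events into per-user features
    {"$group": {
        "_id": "$user_id",
        "event_count": {"$sum": 1},
        "avg_amount": {"$avg": "$amount"},
    }},
    # shape the output like a feature-store row
    {"$project": {"user_id": "$_id", "_id": 0,
                  "event_count": 1, "avg_amount": 1}},
    {"$sort": {"event_count": -1}},
]
```

Pairing a pipeline like this with an index on the grouped key is what makes real-time feature lookup practical.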
Daily Python challenges covering core programming, automation, web scraping, and mini-projects—solidifying fundamentals.
Skills: ※Python ※Scripting ※Automation ※Web Scraping ※OOP
A curated library of ready-to-use code snippets—string parsing, data wrangling, web automation, and more.
Skills: ※Regex ※Pandas ※Shell Scripting ※Python One-Liners ※Browser Automation