Automated Trading Platform - MT5 Integration & Data Pipeline
Turned an inconsistent manual strategy - swinging between -3% and +8% monthly - into a consistent automated system delivering 5-8% monthly returns across 18 symbols simultaneously - all in 6 weeks.
The Problem
An independent trading firm had a strategy that was working - on paper. Traders were executing it manually, but the results were erratic: some months returned +8%, others -3%. Without statistical rigour underpinning the parameters, every month was a coin toss. The inconsistency wasn’t just uncomfortable - it made serious capital allocation impossible.
The brief was clear: automate it and make it consistent. But before writing a single line of execution code, the real work was understanding whether the strategy could survive automation. Manual trading allows for intuition and discretion - an algorithm doesn't have that luxury. Parameters that felt right in practice needed to be validated statistically before we could trust them with live capital.
The first challenge was finding the optimal parameters. The firm had a directional view and a rough set of rules, but no rigorous backtesting framework to quantify performance across different market conditions, symbols, or time windows. Without that foundation, any automated system would be flying blind.
Our Approach
Before writing any execution code, we built the analytical foundation first.
We extracted historical OHLCV data going back to the early 2000s across all 18 symbols and built a backtesting framework from scratch using Pandas and NumPy. Rather than testing a single set of parameters, we ran exhaustive grid searches across the full parameter space, evaluating strategy performance for every combination across each symbol, each month, and each day of the week. The goal was not to find parameters that looked good on average, but to find the optimal configuration per symbol - because what works on EUR/USD does not necessarily work on a commodity or index.
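The exhaustive search can be sketched as follows. The actual strategy rules are the firm's own, so the `backtest` stand-in below scores a hypothetical moving-average crossover - the point is the shape of the search: every parameter combination, scored per symbol, with the best configuration kept for each symbol rather than the best average.

```python
from itertools import product

def backtest(prices, fast, slow):
    """Score one parameter pair on one price series.

    Illustrative stand-in only: a naive moving-average crossover,
    not the firm's actual strategy logic.
    """
    if fast >= slow or len(prices) <= slow:
        return float("-inf")
    pnl, position, entry = 0.0, 0, 0.0
    for i in range(slow, len(prices)):
        fast_ma = sum(prices[i - fast:i]) / fast
        slow_ma = sum(prices[i - slow:i]) / slow
        if fast_ma > slow_ma and position == 0:    # enter long
            position, entry = 1, prices[i]
        elif fast_ma < slow_ma and position == 1:  # exit, book P&L
            pnl += prices[i] - entry
            position = 0
    return pnl

def grid_search(series_by_symbol, fast_grid, slow_grid):
    """Exhaustive search: keep the best (fast, slow) pair per symbol."""
    best = {}
    for symbol, prices in series_by_symbol.items():
        scored = (
            ((fast, slow), backtest(prices, fast, slow))
            for fast, slow in product(fast_grid, slow_grid)
        )
        best[symbol] = max(scored, key=lambda kv: kv[1])
    return best

# Synthetic series for demonstration; real runs used decades of OHLCV data.
data = {"EURUSD": [1.10 + 0.01 * ((i * 7) % 5) for i in range(200)]}
print(grid_search(data, fast_grid=[3, 5], slow_grid=[10, 20]))
```

In production this loop also slices by month and day of the week, which is what made the search exhaustive rather than merely broad.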
The analytics we built around this process were as important as the backtesting itself. We needed to be able to explain why a parameter set was optimal, not just that it was. Win rate, Sharpe ratio, maximum drawdown, and monthly P&L distributions were tracked per symbol and per parameter combination, giving us a clear statistical picture of where the strategy was robust and where it wasn’t.
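The metrics themselves are standard; a minimal sketch of how they were computed per symbol and per parameter combination is below. The Sharpe ratio here is per-period and unannualised with the risk-free rate assumed zero - simplifications for illustration.

```python
import math

def performance_metrics(returns):
    """Summary statistics for one symbol/parameter combination.

    `returns` is a list of per-period fractional returns (e.g. monthly P&L).
    """
    n = len(returns)
    wins = sum(1 for r in returns if r > 0)
    mean = sum(returns) / n
    var = sum((r - mean) ** 2 for r in returns) / (n - 1) if n > 1 else 0.0
    # Per-period Sharpe, risk-free rate assumed 0 for simplicity.
    sharpe = mean / math.sqrt(var) if var else float("inf")
    # Maximum drawdown on the compounded equity curve.
    equity, peak, max_dd = 1.0, 1.0, 0.0
    for r in returns:
        equity *= 1 + r
        peak = max(peak, equity)
        max_dd = max(max_dd, (peak - equity) / peak)
    return {"win_rate": wins / n, "sharpe": sharpe, "max_drawdown": max_dd}

print(performance_metrics([0.05, -0.03, 0.08, 0.06, -0.01, 0.07]))
```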
Critically, the system was not built as a black box. Traders retained full visibility and control - a monitoring layer gave them real-time insight into bot behaviour, open positions, and live performance metrics. Parameters could be adjusted per symbol without touching the codebase, allowing the team to fine-tune the strategy as market conditions evolved without depending on an engineer to make changes.
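The "adjust parameters without touching the codebase" mechanism amounts to a versioned configuration file that the bot validates before picking up. A sketch, assuming a JSON format - the field names below are illustrative, not the production schema:

```python
import json

# Hypothetical per-symbol configuration; field names are assumptions.
RAW = """
{
  "EURUSD": {"lot_size": 0.5, "stop_loss_pips": 40, "take_profit_pips": 90},
  "XAUUSD": {"lot_size": 0.1, "stop_loss_pips": 120, "take_profit_pips": 300}
}
"""

REQUIRED = {"lot_size", "stop_loss_pips", "take_profit_pips"}

def load_params(raw):
    """Parse and sanity-check per-symbol parameters before the bot uses them."""
    params = json.loads(raw)
    for symbol, cfg in params.items():
        missing = REQUIRED - cfg.keys()
        if missing:
            raise ValueError(f"{symbol}: missing {sorted(missing)}")
        if cfg["lot_size"] <= 0:
            raise ValueError(f"{symbol}: lot_size must be positive")
    return params

params = load_params(RAW)
print(sorted(params))
```

Because the file lives in version control, every tuning decision the traders make is tracked, reviewable, and reversible.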
Once we had validated parameters for each symbol, we automated the strategy and deployed it on paper trading - fictitious capital with live market conditions. This phase was deliberately unhurried. We were watching for two things: whether the bot was behaving exactly as the backtests predicted, and whether the live performance metrics were consistent with what the historical analysis had told us to expect. Any divergence between live behaviour and backtested expectations was investigated before moving forward.
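The "any divergence was investigated" gate can be expressed as a simple tolerance check between backtested expectations and live paper-trading metrics. The threshold and metric names below are illustrative, not the values actually used:

```python
def within_tolerance(backtest, live, rel_tol=0.25):
    """Return the metrics whose live values diverge from their backtested
    expectations by more than `rel_tol` (relative). Empty list → proceed."""
    divergent = []
    for name, expected in backtest.items():
        actual = live[name]
        if expected and abs(actual - expected) / abs(expected) > rel_tol:
            divergent.append(name)
    return divergent

# Example figures for illustration only.
expected = {"win_rate": 0.62, "avg_monthly_return": 0.065}
observed = {"win_rate": 0.58, "avg_monthly_return": 0.060}
print(within_tolerance(expected, observed))  # [] → within tolerance
```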
Only once the paper trading phase confirmed the bot was performing as modelled did we consider the system ready for the next stage.
What We Built
The system is built around a clear separation of concerns: real-time execution on one side, historical analysis and monitoring on the other - connected by a Kafka backbone that ties everything together.
Kafka sits at the centre of the architecture. Tick data flows in from MT5 in real time - price, spread, swap rates, and symbol-level metrics - and is consumed by two independent paths simultaneously. The first path feeds the live execution engine, where the trading bot reads the stream, evaluates the current market state against the optimised parameters, and fires orders back through MT5. The second path feeds the storage and analytics layer, ensuring every tick is captured for validation, replay, and ongoing analysis. Execution latency runs consistently under 10 seconds, well within the tolerance of the strategy, which operates on 30-minute candles (M30) where entry precision at the millisecond level is neither required nor meaningful. This was a deliberate architectural choice, not a constraint.
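In Kafka terms, the two paths are two independent consumer groups subscribed to the same tick topic, so each receives every message. The broker-free sketch below models just that fan-out shape in plain Python - the tick fields and the spread threshold are illustrative assumptions, not the production schema:

```python
from dataclasses import dataclass

@dataclass
class Tick:
    symbol: str
    bid: float
    ask: float
    ts: float

class ExecutionPath:
    """Live path: evaluates market state against optimised parameters (stubbed)."""
    def __init__(self):
        self.signals = []
    def on_tick(self, tick):
        if tick.ask - tick.bid < 0.0003:   # example spread filter only
            self.signals.append(tick.symbol)

class StoragePath:
    """Analytics path: every tick captured for validation and replay."""
    def __init__(self):
        self.archive = []
    def on_tick(self, tick):
        self.archive.append(tick)

def fan_out(ticks, paths):
    """Deliver each tick to every path, mirroring two Kafka consumer groups."""
    for tick in ticks:
        for path in paths:
            path.on_tick(tick)

execution, storage = ExecutionPath(), StoragePath()
fan_out([Tick("EURUSD", 1.1000, 1.1002, 0.0),
         Tick("EURUSD", 1.1001, 1.1009, 1.0)], [execution, storage])
print(execution.signals, len(storage.archive))
```

The key property: the storage path sees both ticks, while the execution path only acts on the one that passes its filter - neither path can starve the other.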
Data is persisted in two stores depending on how it needs to be accessed. A warehouse-style relational database holds the structured historical data - OHLCV, trade records, performance metrics - optimised for analytical queries and backtesting runs. A NoSQL store handles the data that needs fast retrieval - live position state, real-time metrics, current parameter configurations - where low latency matters more than query flexibility.
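An illustrative slice of the warehouse side, using SQLite for portability - the column names are assumptions, not the production DDL, but the shape (composite key on symbol and time, indexes matched to analytical access patterns) reflects the design:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE ohlcv (
    symbol  TEXT NOT NULL,
    ts      INTEGER NOT NULL,          -- candle open time, epoch seconds
    open REAL, high REAL, low REAL, close REAL,
    volume  REAL,
    PRIMARY KEY (symbol, ts)
);
CREATE TABLE trades (
    id      INTEGER PRIMARY KEY,
    symbol  TEXT NOT NULL,
    opened  INTEGER,
    closed  INTEGER,
    pnl     REAL
);
-- Backtesting queries scan by symbol and time range, so index accordingly.
CREATE INDEX idx_trades_symbol_closed ON trades (symbol, closed);
""")
conn.execute("INSERT INTO ohlcv VALUES ('EURUSD', 0, 1.1, 1.2, 1.0, 1.15, 5000)")
row = conn.execute("SELECT close FROM ohlcv WHERE symbol = 'EURUSD'").fetchone()
print(row[0])
```

The NoSQL side holds the opposite access pattern - current position state and live parameters keyed for single-digit-millisecond reads rather than range scans.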
Every day a compaction and aggregation job runs automatically, consolidating raw tick and trade data into clean analytical datasets. This keeps the storage layer manageable at scale and feeds the monitoring dashboards with up-to-date performance summaries - win rate, Sharpe ratio, drawdown, and P&L broken down by symbol, day, and time window.
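The compaction step reduces to a group-and-summarise pass: raw ticks in, one summary row per symbol per day out. A minimal sketch, with an assumed tick shape of `(symbol, day, price)`:

```python
from collections import defaultdict

def compact(ticks):
    """Consolidate raw ticks into per-symbol, per-day OHLC summaries.

    `ticks` is an iterable of (symbol, day, price) tuples, assumed to be
    in time order within each day.
    """
    buckets = defaultdict(list)
    for symbol, day, price in ticks:
        buckets[(symbol, day)].append(price)
    return {
        key: {"open": p[0], "high": max(p), "low": min(p), "close": p[-1]}
        for key, p in buckets.items()
    }

raw = [("EURUSD", "2024-01-02", 1.10),
       ("EURUSD", "2024-01-02", 1.12),
       ("EURUSD", "2024-01-02", 1.09),
       ("EURUSD", "2024-01-03", 1.11)]
summary = compact(raw)
print(summary[("EURUSD", "2024-01-02")])
```

The production job additionally folds in trade records and writes the win-rate, Sharpe, drawdown, and P&L rollups the dashboards read.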
Live metrics are streamed directly from Kafka, giving traders a real-time view of bot behaviour without polling the database. Open positions, execution latency, and strategy signals are visible as they happen.
The bot itself runs on MT5, which also serves as the data source for market metrics - spreads, swap rates, and symbol-level data that informed both the backtesting parameter optimisation and the ongoing monitoring of live conditions.
The entire system is containerised with Docker and deployed on-premise. A CI/CD pipeline using GitHub Actions manages strategy deployments and parameter updates - meaning the trading team can push a new parameter set through a review process and have it live without manual intervention or downtime. Version control on strategy configurations means every change is tracked, reversible, and auditable.
The Results
The numbers validated what the backtests had predicted - and then some.
Before the system existed, the strategy was swinging between -3% and +8% month to month, driven by parameters chosen on instinct rather than evidence. That variance wasn’t just uncomfortable - it meant the firm couldn’t confidently size positions or plan around returns. The strategy had edge, but it was being left on the table.
After optimisation and automation, live performance settled into a consistent 5–8% monthly return across the 18 symbols - a range narrow enough to plan around and strong enough to compound meaningfully. That consistency is the real result. Any single month’s headline number matters less than a strategy that behaves the same way whether markets are quiet or volatile.
Live performance on the paper trading account - deployed one month after the initial 6-week build - was on par with what the backtests had predicted. That alignment between historical simulation and live behaviour is the real validation. It means the model wasn’t overfit to historical data, and the execution infrastructure was faithfully implementing what the strategy logic intended.
The monitoring and parameter tuning layer proved its value immediately. From the first weeks of paper trading, the traders were actively using it - adjusting parameters, observing the impact in real time, and iterating. Several optimisations were made collaboratively during the paper trading phase that wouldn’t have been possible without that visibility. The system gave them ownership of the strategy, not just access to it.
The project was delivered in 6 weeks. The firm is now running 18 symbols simultaneously - something that was simply not possible manually - and is actively adding more, using the backtesting and validation tools we left in place to test each new symbol before going live. When your bot is managing 18 symbols around the clock without missing a signal, it's hard to imagine going back to doing it by hand.
"Working with AzCoding was an incredible experience. Alex automated our trading strategy with real professionalism and great energy, and the deep backtesting alongside the detailed data analysis gave us a level of clarity that we never had before. It helped adjust key elements and ultimately make the strategy more robust and performant. I highly recommend working with him!"
Start a similar project
Tell me where your data stack is today and where you need it to go. Free discovery call, no commitment.
Get in touch