A large legacy LAMP environment in the FX sector, with little to no handover, and I was 2 months into the contract. There was a need for up-to-the-second balance information for clients, with these complications:
- Balances in at least 2, and perhaps up to 30, currencies simultaneously.
- There was no journal/ledger system in place, so each ReCalc needed to take Txns from all time into account.
- There was no single point in the PHP application through which every Txn type that could change a balance would pass (around 5 classes and 70 programs).
- There were, however, only 2 tables in the SQL DB through which every Txn type that could change a client's balance would pass.
The code to calculate already existed (in PHP).
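To make the "Txns from all time" point concrete, here is a hypothetical sketch (the file path, column layout, and amounts are all made up, and the real code was PHP) of what a full per-currency ReCalc amounts to when there is no running ledger — every recalculation re-sums the entire history:

```shell
# Hypothetical illustration: with no ledger, a balance is the sum of every
# Txn ever recorded for the account, grouped per currency.
cat > /tmp/txns.csv <<'EOF'
EUR,100.50
USD,-25.00
EUR,-0.50
EOF
# Sum column 2 per currency (column 1) over the whole file, every time.
awk -F, '{ sum[$1] += $2 } END { for (c in sum) printf "%s %.2f\n", c, sum[c] }' /tmp/txns.csv | sort
```

With 30 possible currencies per client and no incremental ledger, this full re-sum is why the nightly batch took hours.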
- A batch run each night (taking 2-3 hrs) got us to a 95% position for the next day, but the remaining 5% held 50% of the data needed for that day.
- My first attempt targeted every balance-changing event at application level (PHP code). It failed: an exhaustive list of events was not available, new places were continuously being found, and it increased page-load latency.
- My next attempt used MySQL triggers to target UPDATE and INSERT events on the 2 DB tables, but the trigger still had to fire a PHP-level event, and it had to return asynchronously so the original Txn would not block.
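The asynchronous-return requirement is the crux: a MySQL trigger fires inside the client's own UPDATE/INSERT, so anything slow done synchronously stalls that statement. A minimal fire-and-forget sketch in shell, with `sleep` standing in for the slow PHP recalculation:

```shell
# Fire-and-forget: background the slow task so the caller returns immediately.
start=$(date +%s)
( sleep 2 ) &            # stand-in for the slow PHP recalculation
end=$(date +%s)
echo "caller blocked for $(( end - start ))s"
wait                     # only this demo waits; a real trigger would not
```

The caller is blocked for effectively zero time regardless of how long the backgrounded work takes.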
In the end it went something like this:
- MySQL Trigger called a Proc
- The Proc used sys_exec (from the lib_mysqludf_sys UDF) to call a bash script, passing args on the CLI:
    select sys_exec(concat("/APPLIVEv3/SHELL/statement.calc.sh ",accountnoX," TriggerStatementCalcFromDealID/",triggerX,"/",dealid)) into tmpX;
- This then used wget to call the PHP event:
    wget $BOHTTPPATH../ot/statement.calc.php?accountno=$1 -O - 2> /dev/null &
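Reconstructed as a function, the script boils down to one backgrounded HTTP call. This is a hedged sketch, not the original: the name statement_calc, the HTTP_GET override, and the localhost default for BOHTTPPATH are assumptions added for illustration and testing:

```shell
#!/bin/bash
# Hypothetical reconstruction of statement.calc.sh (names are assumptions).
statement_calc() {
  local accountno="$1"
  local audit_tag="$2"   # mirrors the proc's second arg: TriggerName/<trigger>/<dealid>
  local url="${BOHTTPPATH:-http://localhost/}ot/statement.calc.php?accountno=${accountno}"
  # Fire-and-forget: background the HTTP call so sys_exec (and therefore
  # the trigger itself) returns immediately instead of blocking the Txn.
  ${HTTP_GET:-wget -O - -q} "$url" 2> /dev/null &
}
```

Overriding HTTP_GET (e.g. with `echo`) lets the script be exercised without a live web server.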
The end result solved the problem, and also the earlier issue of page-load latency in the PHP app, since the recalculation now ran outside the request cycle. The system has been running largely unchanged for over 3 years.