@freemo
how big is your data set?
@mur2501 Not particularly large: about 100,000 data points per stock (1-minute price data for one year across 24 stocks, only while the market is in session, not 24/7), so roughly 2.4 million data points in total. However, my algo doesn't just run through it once either: it first has to copy the full dataset to 24 separate processes, each of those 24 processes has to process all 2.4 million points (all the stocks, not just one per process), and then a final process combines the 2.4-million-point outputs from the underlying processes back into a single final dataset of the same size.
Processing one of the 24 streams across all 24 stocks takes about 20 seconds using 24 of my 32 cores. I'll know shortly how long this much heavier processing load takes to finish.
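For anyone following along, here's a minimal sketch of that fan-out/fan-in layout, assuming Python with multiprocessing; run_stream() and combine() are hypothetical stand-ins, not the actual algo:

```python
# Minimal sketch of the fan-out/fan-in layout described above, assuming
# Python + multiprocessing; run_stream() and combine() are hypothetical
# stand-ins for the real per-stream computation and the final merge.
from multiprocessing import Pool

import numpy as np

N_STREAMS = 24        # one worker process per stream, 24 of 32 cores
N_POINTS = 10_000     # stand-in size; the real dataset is ~2.4M points

def run_stream(args):
    """Each worker receives a COPY of the full dataset (all stocks)
    and returns an output of the same size."""
    stream_id, data = args
    return data * 1.0  # placeholder for the real per-stream processing

def combine(outputs):
    """Fan-in: fold the 24 same-sized outputs into one final dataset."""
    return np.mean(outputs, axis=0)

if __name__ == "__main__":
    data = np.random.rand(N_POINTS)  # stand-in for the 1-min price data
    # Copying `data` to each of the 24 workers is part of the cost,
    # exactly as described above.
    with Pool(processes=N_STREAMS) as pool:
        outputs = pool.map(run_stream, [(i, data) for i in range(N_STREAMS)])
    final = combine(outputs)
    print(final.shape)  # same size as the input dataset
```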
@freemo
Can't say whether that's good or bad until I know what the real-world operating scenario would be.
@mur2501 What do you mean?
@freemo
How much data will it be handling in the real deal?