LOL the algo is up to consuming 20 gigs right now on this stress test. Good thing I have 64 gigs of high-speed memory :)

@mur2501 Not particularly large: 100,000 data points per stock (one year of 1-minute price data across 24 stocks, and only for the hours the market is in session, not 24/7). So about 3 million data points in total. However, my algo doesn't just run through it once: it first copies the data to 24 separate processes, each of which processes all 3 million points (every stock, not just one per process), and then a final process combines the roughly 3 million outputs from those workers back into a single 3 million-point dataset.
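For anyone curious what that fan-out/fan-in looks like, here's a minimal sketch of the pattern in Python's multiprocessing. All the names (process_all_stocks, run_pipeline) are placeholders I made up for illustration, not the actual algorithm:

```python
# Minimal sketch of the fan-out/fan-in pattern described above.
# Each worker receives its own copy of the full dataset (hence the
# memory usage), processes all stocks, and a final step combines
# the 24 partial results.
from multiprocessing import Pool

import numpy as np

NUM_WORKERS = 24


def process_all_stocks(args):
    """Placeholder: each worker scans the full dataset (every stock)."""
    data, worker_id = args
    # Stand-in computation; the real algorithm would go here.
    return data.mean(axis=1) * (worker_id + 1)


def run_pipeline(data):
    """Copy the full dataset to each worker, then combine the outputs."""
    with Pool(processes=NUM_WORKERS) as pool:
        partials = pool.map(process_all_stocks,
                            [(data, i) for i in range(NUM_WORKERS)])
    # Final fan-in step: combine the 24 partial results into one dataset.
    return np.sum(partials, axis=0)


if __name__ == "__main__":
    # ~100,000 one-minute bars per stock, 24 stocks.
    prices = np.random.rand(100_000, 24)
    combined = run_pipeline(prices)
    print(combined.shape)
```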

Processing one of the 24 streams for one of the 25 stocks takes about 20 seconds using 24 of my 32 cores. I'll know shortly how long this much heavier processing load takes to finish.

@freemo
Can't say whether that's good or bad until I know what the real-world scenario of operations would look like.

@freemo
How much data will it be handling in the real deal?


@mur2501 Well, this is the amount it will deal with when I'm doing development, not just as a stress test. This scenario is what I run in order to calculate the algorithm across 25 different stocks and see how well it performs. Then I use that data to tweak the algorithm, try to improve it, and run it again. So this is a normal use case for the algorithm during the analysis and planning phase.

As for when it is running in the wild and actually buying and selling stock, it would run the current scenario once a day before market open to pre-establish the back-data. Then, once every minute, it would ingest one additional data point per stock.
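In rough pseudocode, that live loop would look something like the sketch below. It's only an illustration under assumed names: precompute_backdata, fetch_latest_bar, and update_algorithm are hypothetical stand-ins, and the scheduling is simplified:

```python
# Rough sketch of the live mode described above: one heavy back-data
# pass before the open, then one new 1-minute bar per stock each minute.
# Function names are placeholders, not real code.
import time

STOCKS = ["AAPL", "MSFT"]  # ... 25 symbols in practice
MINUTES_IN_SESSION = 390   # 6.5-hour regular US session


def precompute_backdata(stocks):
    """Stand-in for the once-a-day heavy run before market open."""
    return {s: [] for s in stocks}


def fetch_latest_bar(stock):
    """Stand-in for pulling the newest 1-minute bar for one stock."""
    return {"stock": stock, "close": 0.0}


def update_algorithm(state, bar):
    """Stand-in for the incremental per-minute update."""
    state[bar["stock"]].append(bar["close"])


if __name__ == "__main__":
    state = precompute_backdata(STOCKS)      # heavy run before the open
    for _ in range(MINUTES_IN_SESSION):      # then one bar per minute
        for stock in STOCKS:
            update_algorithm(state, fetch_latest_bar(stock))
        time.sleep(60)
```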

@freemo
Hope you soon become the Wolf of Wall Street :ablobcatbongo:
