I have a very large .csv file containing a numerical matrix. I need to calculate the mean of many selections of values in each row (e.g. in each row, the mean of the values at indices 1, 3, 52, 123; then, in the same row, the values at indices 2, 3, 12, 29, 67; and so on).

My file is HUGE, even in row length (a row has like, 8000+ items), so I need this to be fast. I know the indices I need to average at the start of the computation, but I don't have enough memory to load the whole file at once.

@MrHedmad seems like you'd be seeking into the file. Presumably you can hold a whole row in memory after seeking. Can you make an index so you know which byte to seek to when you want row X? I'm guessing an .h5 format file could do it, but you probably don't want that complexity.
@photocyte
Reading the file one line at a time is not an issue: I can hold a line, even a few hundred or thousand lines, in memory at once. The issue - I think - is subsetting the resulting vector(s) of numbers and doing it fast.
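A minimal sketch of the streaming approach described above, assuming the file is plain comma-separated numbers with no header; the index groups shown are placeholders for whatever selections you actually need. Each row is parsed into a NumPy array, and fancy indexing (`row[g]`) does the subsetting:

```python
import numpy as np

# Placeholder index groups, known before the computation starts.
GROUPS = [np.array([1, 3, 52, 123]), np.array([2, 3, 12, 29, 67])]

def row_group_means(path, groups):
    """Stream the CSV one line at a time; yield one mean per index group per row.

    Only a single row is ever held in memory, so file size is not a constraint.
    """
    with open(path) as f:
        for line in f:
            # Parse the line into a 1-D float array.
            row = np.array(line.split(","), dtype=float)
            # Fancy indexing with each group's index array subsets the row;
            # .mean() then reduces each selection to a single value.
            yield [row[g].mean() for g in groups]
```

If per-line parsing turns out to be the bottleneck, the same fancy indexing works on a 2-D chunk of N rows at once (`chunk[:, g].mean(axis=1)` gives one value per row), which amortizes the parse cost while keeping memory bounded.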