Some simple algorithms can be done this way. Ordinary least squares (OLS) regression, which relies on solving the normal equations, is a typical case. Suppose you have an extremely large table, say 10^15 records and 500 variables. You can build the normal-equation matrix for X and Y while reading the data sequentially, because the system involves nothing more than sums and sums of cross products. After one pass over all records you have the sufficient statistics, and all that remains is to apply a sweep operator to the 502-by-502 normal-equation matrix (500 predictors plus the intercept and the response).
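Here is a minimal sketch of the one-pass accumulation, assuming the table arrives in row blocks (faked by a generator here) with the predictors in the first 500 columns and the response last; for brevity it solves the normal equations with `np.linalg.solve` rather than an actual sweep operator:

```python
import numpy as np

P = 500  # number of predictors

def chunks():
    """Stand-in for sequentially reading the huge table in row blocks."""
    rng = np.random.default_rng(0)
    for _ in range(10):                        # pretend there are 10 blocks
        X = rng.normal(size=(1000, P))         # predictors for this block
        y = X @ rng.normal(size=P) + rng.normal(size=1000)
        yield X, y

# Accumulate the cross products X'X and X'y, with an intercept column added.
XtX = np.zeros((P + 1, P + 1))
Xty = np.zeros(P + 1)
for X, y in chunks():
    Xa = np.hstack([np.ones((X.shape[0], 1)), X])  # prepend intercept
    XtX += Xa.T @ Xa
    Xty += Xa.T @ y

# Coefficients recovered from the sufficient statistics alone.
beta = np.linalg.solve(XtX, Xty)
```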
Of course, the data is so huge that you will have to distribute it over thousands of machines and compute the sums in parallel.
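The distributed step is trivial because cross-product matrices simply add across shards. A toy sketch, using a local `multiprocessing.Pool` as a stand-in for a real cluster and simulated shards in place of actual data partitions:

```python
import numpy as np
from multiprocessing import Pool

P = 500

def local_cross_products(seed):
    """One worker: read its shard (simulated here) and return local X'X, X'y."""
    rng = np.random.default_rng(seed)
    X = rng.normal(size=(1000, P))
    y = X @ rng.normal(size=P) + rng.normal(size=1000)
    Xa = np.hstack([np.ones((X.shape[0], 1)), X])
    return Xa.T @ Xa, Xa.T @ y

if __name__ == "__main__":
    with Pool(4) as pool:
        parts = pool.map(local_cross_products, range(8))  # 8 shards
    XtX = sum(p[0] for p in parts)   # partial sums combine by addition
    Xty = sum(p[1] for p in parts)
    beta = np.linalg.solve(XtX, Xty)
```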
So-called streaming algorithms, originally designed for extremely long tables, also look promising for this kind of problem. With such an algorithm you may not even need to read all of the data to obtain a strongly consistent result. But it requires a custom design for each specific case, exploiting special stochastic characteristics of the data, and it also depends on what you want to do; there is no universal solution.
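Since no particular streaming algorithm is specified above, the following is purely illustrative: a plain stochastic-gradient pass for linear regression that touches only randomly sampled records, one of many possible designs whose suitability would depend on the data.

```python
import numpy as np

P = 500
rng = np.random.default_rng(1)
true_beta = rng.normal(size=P)   # hypothetical "true" coefficients for the demo

beta = np.zeros(P)
for t in range(1, 200_001):
    x = rng.normal(size=P)                 # one sampled record
    y = x @ true_beta + rng.normal()       # its (simulated) response
    step = 0.01 / np.sqrt(t)               # decaying step size
    beta += step * (y - x @ beta) * x      # gradient step on the squared error
```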