Asymptotics In Statistical Inference
See the rant in Tumblelog.
Also: “Informally, statistical consistency says that, as we get more and more data, we converge more and more closely — and, ultimately, arbitrarily closely — to the true underlying model.” But immediately after this there’s a very good explanation of what statistical consistency really does say, finishing with “more data will ultimately take us to the correct answer” (my emphasis).
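For reference, a sketch of the usual textbook formalization of (weak) consistency, stated as convergence in probability of an estimator to the true parameter (this is my gloss, not a quotation from the source above): an estimator $\hat{\theta}_n$ built from $n$ observations is consistent for $\theta$ if

$$
\forall \varepsilon > 0: \quad \lim_{n \to \infty} P\!\left(\left|\hat{\theta}_n - \theta\right| > \varepsilon\right) = 0 .
$$

The “ultimately” is doing the work: the guarantee is a limit statement over growing $n$, not a claim about any particular finite sample.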
Can tie this in with internal realism — especially Brian Ellis’s version, and maybe also Nick Jardine’s.