
Post History

#1: Initial revision by user avatar seth.wagenman‭ · 2020-12-04T20:20:56Z (over 3 years ago)
Disentangling Machine Learning Theory with Cross-Validation
Does anyone see a link between machine learning's repeated epochs of training and the concept of cross-validation in linear modeling theory?

This article demonstrates what I perceive to be confusion about the validity of cross-validation combined with Bayesian optimization:

https://piotrekga.github.io/Pruned-Cross-Validation/

I am starting to believe cross-validation is actually a slower, less effective approximation of Bayesian inference. That opinion is informed by this Biometrika article from earlier this year, although I do not fully agree with its theoretical framework of coherence and would prefer a more [Jaynesian](https://ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-050j-information-and-entropy-spring-2008/inference/) approach (unfortunately, Jaynes has been dead for more than 20 years):

https://academic.oup.com/biomet/article/107/2/489/5715611
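To make the terminology concrete, here is a minimal sketch of k-fold cross-validation, using only the standard library and a deliberately trivial model (a mean predictor on synthetic Gaussian data). The function names and the choice of model are my own illustration, not anything from the linked articles; the point is just that CV repeatedly refits on one part of the data and scores on the held-out part, which is the procedure whose Bayesian interpretation is in question.

```python
import random

def k_fold_splits(n, k):
    """Yield (train_idx, test_idx) index pairs for k-fold cross-validation."""
    idx = list(range(n))
    random.Random(0).shuffle(idx)          # fixed seed for a reproducible split
    folds = [idx[i::k] for i in range(k)]  # k roughly equal, disjoint folds
    for i in range(k):
        test = folds[i]
        train = [j for f, fold in enumerate(folds) if f != i for j in fold]
        yield train, test

def cv_mse(y, k=5):
    """Estimate out-of-sample MSE of the mean predictor via k-fold CV."""
    errs = []
    for train, test in k_fold_splits(len(y), k):
        mu = sum(y[j] for j in train) / len(train)    # "fit" on training folds
        errs.extend((y[j] - mu) ** 2 for j in test)   # score on held-out fold
    return sum(errs) / len(errs)

# Synthetic data: 100 draws from a standard normal, so the true
# out-of-sample MSE of the mean predictor is about 1.
rng = random.Random(1)
y = [rng.gauss(0, 1) for _ in range(100)]
print(cv_mse(y, k=5))
```

Note the contrast with repeated training epochs: each CV fold fits a fresh model on a different subset and never sees its own test points, whereas epochs revisit the same full training set, so the two kinds of "repetition" estimate different things.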