
Batch Gradient Descent: The Tradeoff Between Convergence Threshold and Training Set Size

2015-01-17 10:45
When the convergence threshold (the predefined value used to decide when convergence has occurred) is set small enough, a small number of training samples is sufficient for the learning program to find a good enough hypothesis. There is a tradeoff effect here.
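For concreteness, this kind of convergence test is usually applied to the least-squares cost of linear regression. The cost and update rule below are the standard formulation and are my assumption, since the post does not state them explicitly:

$$J(\theta) = \frac{1}{2m}\sum_{i=1}^{m}\bigl(h_\theta(x^{(i)}) - y^{(i)}\bigr)^2, \qquad \theta_j := \theta_j - \alpha\,\frac{1}{m}\sum_{i=1}^{m}\bigl(h_\theta(x^{(i)}) - y^{(i)}\bigr)\,x_j^{(i)}$$

Training stops once $|J(\theta_{\text{old}}) - J(\theta_{\text{new}})| < \epsilon$, where $\epsilon$ is the convergence threshold.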

Put another way: when there are enough training samples, the convergence threshold can be set higher than it otherwise could be, so computing time can be reduced.
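Below is a minimal sketch of batch gradient descent with such a stopping rule, so the comparison can be reproduced in spirit. It assumes linear regression on synthetic NumPy data with a learning rate of 0.01 and a feature range of my own choosing; none of these details come from the original post, so the exact iteration counts and plots will not match the figures that follow.

```python
import numpy as np

def batch_gradient_descent(X, y, alpha=0.01, threshold=1e-4, max_iters=100000):
    """Batch gradient descent for linear regression.

    Stops when the change in the cost J(theta) between two consecutive
    iterations falls below `threshold` (the convergence threshold
    discussed above), or when `max_iters` is reached.
    """
    m, n = X.shape
    theta = np.zeros(n)

    def cost(t):
        residual = X @ t - y
        return residual @ residual / (2 * m)

    prev_cost = cost(theta)
    for it in range(max_iters):
        gradient = X.T @ (X @ theta - y) / m   # uses the whole training set
        theta -= alpha * gradient
        new_cost = cost(theta)
        if abs(prev_cost - new_cost) < threshold:   # convergence test
            return theta, it + 1
        prev_cost = new_cost
    return theta, max_iters

# Compare iteration counts for the three configurations discussed below.
rng = np.random.default_rng(0)
for m, eps in [(10, 1e-6), (10, 1e-4), (100, 1e-4)]:
    X = np.column_stack([np.ones(m), rng.uniform(0, 10, m)])  # bias + one feature
    y = 3 + 2 * X[:, 1] + rng.normal(0, 0.5, m)               # noisy line
    theta, iters = batch_gradient_descent(X, y, alpha=0.01, threshold=eps)
    print(f"m={m:4d}  threshold={eps:g}  theta={theta}  iterations={iters}")
```

As expected, the tighter threshold costs noticeably more iterations at the same sample size, which is the computing-time side of the tradeoff described above.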

When the sample size is 10 and the convergence threshold for J(θ) is set to 0.000001, the result is:



When the sample size is 10 and the convergence threshold for J(θ) is set to 0.0001, the result is:



When the sample size is 100 and the convergence threshold for J(θ) is set to 0.0001, the result is:
