Iteration Dogma
The Iteration Dogma is the belief that adapting to evidence is far superior to picking the right prior. It is commonly employed in the startup mindset of searching for product-market fit, where it is considered boringly true.
Examples
- Searching for product-market fit is the primary example.
- Bayesian updating, given prior support and distinguishable hypotheses (in particular, the truth is distinguishable from the alternatives in the KL-divergence sense); see the sketch after this list
- Popper-style empirical science, in which falsifiable conjectures are formed and iteratively refined through targeted refutation
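
As a rough illustration of the Bayesian case, here is a minimal sketch assuming a toy coin-flip model (the candidate biases, prior, and sample size are invented for demonstration, not taken from the article). Because the prior puts some mass on the true bias and the candidates are mutually distinguishable (pairwise KL divergence greater than zero), iterating Bayes' rule pulls nearly all posterior mass onto the truth even though the prior initially favors a wrong hypothesis.

```python
import numpy as np

# Toy model: three candidate coin biases (hypotheses); the data come from the
# middle one. The prior supports the truth and the hypotheses are
# distinguishable, so the posterior on the true bias approaches 1 as evidence
# accumulates, regardless of the (deliberately skewed) prior.
rng = np.random.default_rng(0)

hypotheses = np.array([0.3, 0.5, 0.7])   # candidate P(heads)
true_p = 0.5                              # the data-generating bias
prior = np.array([0.2, 0.2, 0.6])         # prior that favors the wrong hypothesis

posterior = prior.copy()
flips = rng.random(2000) < true_p         # simulated coin flips

for heads in flips:
    likelihood = np.where(heads, hypotheses, 1.0 - hypotheses)
    posterior = posterior * likelihood    # Bayes' rule: prior times likelihood
    posterior /= posterior.sum()          # renormalize after each update

print(dict(zip(hypotheses, posterior.round(4))))
# With enough flips, nearly all mass sits on 0.5 despite the skewed prior.
```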
Where It Does Not Work
The iteration dogma is a bad idea to employ in fields where global convergence is not possible. For example, finding the roots of a polynomial with Newton-Raphson will result in slow convergence if started in the wrong place, as the sketch below illustrates.
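
A minimal sketch of that failure mode, assuming the standard Newton-Raphson update x ← x − f(x)/f'(x) on an illustrative polynomial f(x) = x³ − x chosen here for demonstration (not taken from the article): a start near a root converges in a handful of quadratic steps, while a distant start spends dozens of iterations drifting back toward the roots before it settles.

```python
# Newton-Raphson on f(x) = x^3 - x: the starting point determines how quickly
# (or whether) the iteration settles near a root.

def newton(f, df, x0, tol=1e-10, max_iter=200):
    """Return (root_estimate, iterations_used)."""
    x = x0
    for i in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            return x, i + 1
    return x, max_iter

f  = lambda x: x**3 - x
df = lambda x: 3 * x**2 - 1

print(newton(f, df, 2.0))    # good start: converges to 1.0 in a few iterations
print(newton(f, df, 1e10))   # bad start: roughly sixty iterations before settling
```

From the distant start each step only shrinks x by about a third (x ← (2/3)x for large x), so the iteration pays a long linear-looking preamble before the fast quadratic phase near the root kicks in.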
