(The notes for Energy-based Models and Boltzmann Machines are not included here. I will try to add them sometime in the future.) Assumption: the computational machinery necessary to express complex behaviors requires highly varying mathematical functions (e.g., highly non-linear functions). Two things discussed: the depth of the architecture, and the locality of estimators. What matters for generalization is not dimensionality, … More Reading notes: Learning Deep Architectures for AI (by Yoshua Bengio, 2009)
“Multilayer feedforward networks are universal approximators” (1989) https://pdfs.semanticscholar.org/f22f/6972e66bdd2e769fa64b0df0a13063c0c101.pdf Standard multilayer feedforward networks with as few as one hidden layer using arbitrary squashing functions are capable of approximating any Borel measurable function from one finite dimensional space to another to any desired degree of accuracy, provided sufficiently many hidden units are available. In this sense, multilayer feedforward … More Multilayer feedforward networks are universal approximators
http://www.loebner.net/Prizef/TuringArticle.html Three questions: “Can machines think?” “Are there imaginable digital computers which would do well in the imitation game?” “Let us fix our attention on one particular digital computer C. Is it true that by modifying this computer to have an adequate storage, suitably increasing its speed of action, and providing it with an appropriate … More Reading Notes: Computing machinery and intelligence by A.M. Turing
For a moment in the 1980s, it seemed that knowledge engineering was about to take over the world. Marvin Minsky, an MIT professor and AI pioneer, is skeptical of any unifying ideas in AI. The Cyc project is the most notorious failure in the history of AI. Fred Jelinek once said: every time I fire a … More Reading Notes: The Master Algorithm (Part 2)
“The Master Algorithm” by Dr. Pedro Domingos is a nice book. I enjoyed reading it. Learners program themselves. Learning algorithms are artifacts that design other artifacts. Hundreds of new learning algorithms are invented every year, but they are all based on the same few basic ideas. Some key questions: How do we learn? Is there … More Reading Notes: The Master Algorithm (Part 1)
Reading Note: Real-Time Machine Learning: The Missing Pieces. Context: ML has predominantly focused on training and serving predictions based on static models (the supervised learning paradigm), where static models are trained on offline data. There is a strong shift toward tight integration of ML models in feedback loops, a broader paradigm (RL). Applications may operate in real environments … More Reading Note: Real-Time Machine Learning: The Missing Pieces
This is my reading note for the paper titled “Machine Learning: The High-Interest Credit Card of Technical Debt”.
Machine learning is a powerful toolkit for building complex systems quickly, but these quick wins do not come for free.
Dilemma: speed of execution versus quality of engineering. … More Reading notes: Machine Learning: the high-interest credit card of technical debt