What this work shows is that if you have the right basic input material (data) with the right distribution (here, a heterogeneous one across a bunch of robots), and you train a high-capacity neural net on it, you get out something greater than the sum of its parts - a model with surprisingly good out-of-distribution generalization, as a consequence of some critical reaction that occurs due to your combination of data + architecture + complexity. Sometimes I think that developing AI is more like a chemical process than a machining one.
Jack Clark | Import AI 343: Humanlike AI; LLaMa 2 protests; the NSA's new AI center