In this note I summarize the insights gained from reading Extropic's paper on their new computational technology, termed Thermodynamic Computation. Additional resource: Extropic's founders interviewed on Garry Tan's YouTube channel.

  • Probabilistic computing can connect directly to AI at the system level via Energy-Based Models (EBMs), a class of efficient deep learning models. Extropic leverages Denoising Thermodynamic Models (DTMs), which, like Diffusion Models, are much more capable than plain EBMs
  • Existing devices have relied on exotic components, such as magnetic tunnel junctions, as sources of intense thermal noise for random-number generation (RNG)
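
The role of such a noise source can be pictured as a probabilistic bit: a device whose output is 1 with a probability set by an input bias, with thermal noise supplying the randomness. The sketch below is purely illustrative (the `p_bit` name and sigmoid bias model are my own assumptions, not from the paper), with a pseudo-random generator standing in for the physical noise:

```python
import math
import random

def p_bit(bias: float, rng: random.Random) -> int:
    """Software stand-in for a hardware probabilistic bit: thermal noise
    is modeled by a pseudo-random draw, and the bit reads out 1 with
    probability sigmoid(bias)."""
    p_one = 1.0 / (1.0 + math.exp(-bias))
    return 1 if rng.random() < p_one else 0

rng = random.Random(0)
samples = [p_bit(2.0, rng) for _ in range(10_000)]
print(sum(samples) / len(samples))  # close to sigmoid(2.0) ≈ 0.88
```

The appeal of the hardware version is that the randomness is free: the physics generates the samples, rather than a pseudo-random algorithm burning digital cycles.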

Notes

  • Monolithic Energy-Based Models (EBMs) use one large, unified energy function (analogous to an error function) to determine the most probable state
  • Diffusion EBMs instead consist of more modular denoising layers and are often more efficient than monolithic EBMs, because no single energy function has to capture all of the data's structure at once