Em Adespoton

  • 0 Posts
  • 274 Comments
Joined 2 years ago
Cake day: June 4th, 2023

  • Supercomputers once required large power plants to operate, and now we carry around computing devices in our pockets that are more powerful than those supercomputers.

    There’s plenty of room to further shrink the computers, simplify the training sets, formalize and optimize the training algorithms, and add optimized layers to the AI compute systems and the I/O systems.

    But at the end of the day, when training you can either simplify the system or throw lots of energy at it.

    Just look at how much time and energy go into training a child… and that’s with a training system that’s been optimized over hundreds of thousands of years (and is still being tweaked).

    AI as we see it today (as far as generative AI goes) is much simpler: just setting up and executing probability sieves, with a fancy instruction parser to feed them their inputs. But it’s running on hardware that’s barely optimized for the task at all, and the approach is far from the most efficient way to process data to determine an output.
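
    To make the “probability sieve” framing concrete, here’s a toy sketch (my own, not anyone’s actual implementation): at each step you look up a probability distribution over possible next tokens and sample one. The token table below is invented for illustration; a real model computes these probabilities with a neural network over the entire context, not a lookup table.

        import random

        # Toy "probability sieve": at each step, sample the next token from a
        # probability table keyed by the current token. The table is made up;
        # a real model derives these probabilities from the whole context.
        NEXT_TOKEN_PROBS = {
            "the": {"cat": 0.5, "dog": 0.3, "<end>": 0.2},
            "cat": {"sat": 0.6, "ran": 0.4},
            "dog": {"ran": 0.7, "<end>": 0.3},
            "sat": {"<end>": 1.0},
            "ran": {"<end>": 1.0},
        }

        def generate(start, max_tokens=10):
            out = [start]
            for _ in range(max_tokens):
                probs = NEXT_TOKEN_PROBS.get(out[-1])
                if probs is None:
                    break
                tokens = list(probs)
                weights = list(probs.values())
                nxt = random.choices(tokens, weights=weights)[0]
                if nxt == "<end>":
                    break
                out.append(nxt)
            return " ".join(out)

        print(generate("the"))  # e.g. "the cat sat"

    The whole trick of scaling this up is replacing the five-entry table with billions of parameters, which is exactly why the hardware question matters so much.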


  • Magnets that power these facilities are imperfect, and even tiny fluctuations in magnetism can cause resonance.

    Thing is, it’s not even limited to the magnets. As the energies go up, things like global gravitational flux and even changes in mass near the accelerator will affect the particles’ path. If those perturbations line up with a resonant frequency over time, you can get the double bounce effect (see the toy sketch at the end of this comment).

    I wasn’t able to tell how much of this they’re currently accounting for with their model, or if they are only factoring in the known magnetic imperfections.
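
    Here’s a toy sketch of the resonance mechanism (mine, with made-up numbers for the tune, kick size, and turn count; not the researchers’ model): treat the ring as a linear one-turn rotation of a particle in (x, x′) phase space and apply the same tiny field-error kick every turn. When the oscillation frequency (the tune) sits on a resonance, the kicks add up coherently and the amplitude grows; otherwise they largely cancel.

        import math

        # One-turn map sketch: rotate (x, x') by 2*pi*tune, then apply the
        # same tiny dipole-error kick. All numbers are invented for illustration.
        def peak_amplitude(tune, kick=1e-4, turns=2000):
            x, xp = 0.0, 0.0
            c, s = math.cos(2 * math.pi * tune), math.sin(2 * math.pi * tune)
            peak = 0.0
            for _ in range(turns):
                x, xp = c * x + s * xp, -s * x + c * xp  # betatron oscillation
                xp += kick                               # same small error every turn
                peak = max(peak, math.hypot(x, xp))
            return peak

        # On an integer resonance the kicks add coherently; off resonance they cancel.
        print(peak_amplitude(tune=1.0))   # grows to ~0.2 over 2000 turns
        print(peak_amplitude(tune=1.31))  # stays around 1e-4

    A real lattice model layers nonlinear magnets, coupling, and time-varying perturbations on top of this, which is where effects like gravitational flux or nearby mass changes would enter.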