A new paper suggests diminishing returns from larger and larger generative AI models. Dr Mike Pound discusses.

The Paper (No “Zero-Shot” Without Exponential Data): https://arxiv.org/abs/2404.04125

  • CheesyFox@lemmy.sdf.org · 7 months ago

    There’s a lot to optimize in LLMs, and I never said otherwise. Though if the field were seriously researched, photonic computers could consume as little power as an LED lamp, making them even more efficient than our brain. Given the total number of computers in the world, even the slightest power-consumption optimization would save a colossal amount of energy, and in the case of photonics the raw numbers could be unimaginable.
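
    As a rough illustration of that scale, here’s a back-of-envelope sketch in Python; the device count and per-device saving are made-up assumptions for illustration, not measurements:

    ```python
    # Back-of-envelope sketch: how much energy a tiny per-device saving
    # adds up to across a large fleet of computers. All figures are
    # illustrative assumptions, not measurements.
    devices = 2_000_000_000   # assumed number of computers worldwide
    watts_saved = 1.0         # assumed average saving per device (W)
    hours_per_year = 24 * 365

    total_twh = devices * watts_saved * hours_per_year / 1e12  # Wh -> TWh
    print(f"~{total_twh:.1f} TWh saved per year")  # ~17.5 TWh with these numbers
    ```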

    Regarding research…

    I bet they’ll simply find a way to greatly simplify the mathematical machinery of neuron interactions. Matrix multiplication is kinda slow, and there’s a lot of it.
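
    For a sense of just how much of it there is, here’s a minimal Python sketch of the matmul cost per generated token in a decoder-only transformer; the sizes are assumed (loosely GPT-3-scale), not taken from any particular model card:

    ```python
    # Rough count of the multiply-add work that the matrix
    # multiplications alone cost per generated token.
    # Sizes below are assumptions, loosely GPT-3-scale.
    d_model = 12_288   # hidden size (assumed)
    n_layers = 96      # number of transformer layers (assumed)

    # Per layer, per token: Q/K/V/output projections (~4 * d^2
    # multiply-adds) plus the MLP with a 4x expansion (~8 * d^2).
    matmul_macs = n_layers * (4 * d_model**2 + 8 * d_model**2)
    flops = 2 * matmul_macs  # one multiply-add counts as 2 FLOPs

    print(f"~{flops / 1e9:.0f} GFLOPs of matmul per token")  # ~348 here
    ```

    That’s hundreds of gigaflops for every single token, nearly all of it matrix multiplication, which is why any simplification of that machinery would pay off so broadly.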