Model Collapse
A.I. draws information from past entries, past writing, past research...it's not like an A.I. program is DOING the research in a lab or writing a white paper using only its own 'mind'. My friends thought I was nuts - mainly because I had a difficult time explaining my ideas. I finally gave up...but then last night I found this...and THIS is what I was trying to explain...
From Forbes: Model collapse, recently detailed in a Nature article by a team of researchers, is what happens when AI models are trained on data that includes content generated by earlier versions of themselves. Over time, this recursive process causes the models to drift further away from the original data distribution, losing the ability to accurately represent the world as it really is. Instead of improving, the AI starts to make mistakes that compound over generations, leading to outputs that are increasingly distorted and unreliable. This isn't just a technical issue for data scientists to worry about. If left unchecked, model collapse could have profound implications for businesses, technology, and our entire digital ecosystem.
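To make that recursive loop concrete, here is a minimal toy sketch in Python, loosely inspired by the Gaussian example in the Nature paper the Forbes piece refers to (Shumailov et al., 2024). The specific distribution, sample size, and generation count are illustrative assumptions, not details from the article: each "generation" fits a simple model to samples produced by the previous generation's model, and real data never re-enters the loop.

```python
# Toy model-collapse simulation (illustrative assumptions throughout):
# each generation fits a Gaussian to the previous generation's synthetic
# samples, then the next generation trains only on samples drawn from
# that fitted model.
import numpy as np

rng = np.random.default_rng(seed=0)

# Generation 0: "real" data drawn from the true distribution N(0, 1).
data = rng.normal(loc=0.0, scale=1.0, size=1_000)

for generation in range(10):
    # "Train" a model on the current data: here, just estimate the
    # mean and standard deviation of a Gaussian.
    mu, sigma = data.mean(), data.std()
    print(f"gen {generation}: mu={mu:+.3f}, sigma={sigma:.3f}")

    # The next generation sees only synthetic samples from the fitted
    # model -- the original "real" data is never used again.
    data = rng.normal(loc=mu, scale=sigma, size=1_000)

# Because each finite sample tends to under-represent the tails, sigma
# tends to drift downward across generations: the model's picture of
# the world gradually narrows, which is the essence of model collapse.
```

Running this, you would typically see sigma wander away from 1.0 and shrink over generations, while mu drifts from 0.0 - a small-scale version of the compounding distortion the article describes.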
Not arguing against A.I. at all - just fascinated by the potential upside and downside. We know that A.I. has 'created' footnotes that don't exist...has created legal 'precedents' that don't exist...once created, do they become part of the 'content' A.I. draws from in the future? Do we know?
Just one more thing I find interesting about Artificial Intelligence.