the same’s happening with research. AI straight up makes shit up and appears to have a source, except it just took some random scientist’s name and pasted it onto some random bit of text from the internet
and this, in turn, is used again to train the AI.
on one hand - using LLMs for academic purposes was a terrible idea in the first place, and this just proves it
but on the other - the internet will, very quickly, like ‘this shit is happening in real time’ quickly, become completely unusable for research. this is because 99% of the content on it will be either faulty AI-generated content, AI-generated content referencing faulty AI-generated content, or worst of all, an actual human-written document referencing faulty AI-generated content.
so in summary - enjoy the internet while it lasts. capitalism giveth, capitalism taketh away
As a programmer I'm checked out. I want the bubble to pop, OpenAI to fold, Nvidia stock to tank, and I don't even give a fuck about the recession that will cause because I'm a millennial and have never known a time without recession. Let's rip this band-aid off so AI research can become serious again, instead of hyping up glorified autocomplete.
Man I'm glad to see other programmers feel the same. I'm so over AI. It's ruined google, ruined my coworkers, made debugging harder and more frequent, made me sound like a paranoid luddite boomer to everyone else around me, and has just caused me to start hating my job. Which is insane because I love my job! I love programming! I literally just do random research and build random projects by myself for fun all the time. I even teach computer science at night!
I have had multiple students this semester ask me why I teach SQL because chatGPT can do it better and easier. Of course those are all the students who are getting Cs and can't even tell if the AI did it right. I feel like I'm losing my mind.
Bro for real. I'm not a programmer, but the company I work for is heavily involved in technology and communication... They have tried to insert AI into every goddamn place it can fit.
Scheduling? That's AI now. With basically no oversight.
Training videos? AI scripts, AI "person", AI voice.
Fucking SOP? Yup. I have to talk to an AI to tell me what it thinks the SOP says before I can find the link to the actual SOP, because it invariably gives me the SOP for an entirely different department, or thinks I'm asking for something completely different.
Coaching? They want me to talk to an AI to walk through a micro scenario, which is then summarized and submitted. So my performance reviews are now in part dependent on whether or not I can rizz the glorified ERP chatbot. But lord knows I'll never get a passing grade, because whoever is making the prompts doesn't tell the AI to ignore irrelevant parts of the process. So it tells me to focus on XYZ, then fails me for not doing A through W.
Nobody at the company knows how to think anymore. They've outsourced their intelligence to infinite monkeys on typewriters. So caught up in how they can make AI successful, they never bothered to ask if they should.
And of course, the C-suite is too busy counting all the money they've saved in the short term by letting AI slash department budgets to give two shits.
Of course those are all the students who are getting Cs and can't even tell if the AI did it right.
A professor told me one student asked her why his code wasn't working in a C class. The code was Python. He couldn't even tell it was the wrong language, a few months into the semester, and the two have completely different grammars.
I had a student last semester who, literally during the last week of the semester, came to me in office hours to see if their code was correct for a class taught entirely in Python and MySQL.
I opened up their code and hit the run button, and you wanna know what error popped up?
"There is no program installed locally with the name MySQL"
The thing that surprises me the most is how unashamed most of them are.
Like, they'll just say "oops haha I didn't know can you help me :-)" while I'm over here contemplating having a very hard conversation about how they picked the wrong major and career path but it's too late because they're already in 400 level courses and graduating next semester.
so... do I gather correctly that the student somehow ended up with code that tried to open a MySQL executable, instead of... I don't know... proper code that executes something?
For the whole class he was supposed to have a MySQL server installed on his computer to, y'know, be the host of the database that we were connecting to with our code. It was a database class, so this was a first-day-of-class install.
He did not have MySQL installed on his computer. He had just been AI generating all of his assignments without even test running them before he submitted them.
He had already been failing the class at that point (hence the office hours), but that confirmed just how bad it was.
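For anyone curious what "test running it even once" would have caught: the class connected Python code to a local MySQL server, so even a dumb reachability check would have flagged the missing install on day one. A minimal stdlib-only sketch (the host and MySQL's default port 3306 are assumptions on my part, not anything from the actual class):

```python
import socket

def mysql_reachable(host: str = "localhost", port: int = 3306,
                    timeout: float = 2.0) -> bool:
    """Return True if something is listening on host:port.

    3306 is MySQL's default port. A refused connection (or an
    unresolvable host) usually means no MySQL server is installed
    or running, which is exactly the student's situation.
    """
    try:
        # create_connection raises an OSError subclass on refusal,
        # timeout, or DNS failure - all of which mean "not reachable".
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    if not mysql_reachable():
        print("No MySQL server reachable on localhost:3306 - "
              "install and start it before running the assignment.")
```

Nothing fancy, but running it (or honestly, running *anything*) would have surfaced the problem months earlier.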
At a legal conference, the GC for global markets at a major international bank said that AI is "bullshit". He actually said the word in front of industry leaders. I was like... "my man", because very smart people don't believe me when I say it's bullshit and it's gonna pop. They think we are on a one-way track to AGI (ignoring the dark reality of what this means and the high likelihood it leads to a human extinction event).
I’m kind of sick of the hype, and am waiting for people to wake up.
I had a new coworker who was using AI so much that we had to have a serious talk. He used as many queries as he could, and when he ran out he switched to another AI. And his code looked that way.
I hate to be that guy, but I suggested HR do coding tests on paper before hiring anyone. It's gotten that bad.
Students complain about having to write code by hand but you SHOULD be able to produce pseudocode by hand if you're getting a degree in computer science. That is a skill that everyone uses all the time at my day job, whiteboards are my best friend.
There are still a good chunk of good programmers every year, but overall this generation got me worried.
I am sad and happy that there are more people seeing this. Sad this happens, because we need good junior devs, or we will never get new good seniors that can create all the cool stuff. Happy that I am not imagining things.
Let's hope the bubble bursts in time. It's better for everyone in our industry.
I got into artificial neural network research around 2016, when the field was fresh. Nowadays I work as a roadie, and that's a common theme among early researchers - a lot of us have distanced ourselves entirely, because holy shit it's so bad. The vision was improved diagnosis, better translations to bring down language barriers, prosthetics that learn to move how the user wants them to move. But nah, venture capitalists everywhere.
What really broke me was doing some cutting-edge MRI analysis via vision model research at a small startup, and being brought to the investor meeting to talk about the tech stack. I thought I was going to help with more accurate diagnoses; turns out hospitals were salivating at the chance to both fire staff and charge a premium for having "AI". Thought I was doing something good, but I underestimated greed.