OpenAI is about to preview their latest model, o1. It's over for almost everyone.

you don't have any idea what u just wrote and copied half of it from reddit

not a single molecule
Only 10% of what a software engineer does is coding. People still have trouble understanding this.
 
you don't have any idea what u just wrote and copied half of it from reddit

not a single molecule
you believe that, if it makes you feel better. i have been following the space closely since GPT-3 was released.

in the meantime, I'm going to keep integrating RAG tools, AI coding assistants, and agents into my workflow.

narrow-minded tards like urself are going to get left behind with that dismissive mindset.
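For anyone who hasn't touched these tools, here is a minimal sketch of the RAG workflow being described: retrieve relevant snippets, stuff them into the prompt, then hand the prompt to a model. The toy corpus, the keyword-overlap retriever, and the call_llm() stub are all illustrative assumptions, not any specific product's API.

```python
# Toy retrieval-augmented generation (RAG) loop: retrieve, build prompt, call model.
# Everything here is an illustrative stand-in, not a real tool's interface.

def retrieve(query: str, corpus: list[str], k: int = 2) -> list[str]:
    """Rank documents by how many words they share with the query (naive retriever)."""
    q_words = set(query.lower().split())
    ranked = sorted(corpus, key=lambda doc: len(q_words & set(doc.lower().split())), reverse=True)
    return ranked[:k]

def build_prompt(query: str, context_docs: list[str]) -> str:
    """Place the retrieved context ahead of the question."""
    context = "\n".join(f"- {doc}" for doc in context_docs)
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

def call_llm(prompt: str) -> str:
    """Hypothetical model call; swap in a real API client or local model here."""
    return f"[model response to a {len(prompt)}-character prompt]"

if __name__ == "__main__":
    corpus = [
        "o1 is an OpenAI model aimed at multi-step reasoning.",
        "RAG retrieves documents and feeds them to the model as extra context.",
        "Junior developers tend to spend more of their time writing code than seniors.",
    ]
    question = "What does RAG actually do?"
    print(call_llm(build_prompt(question, retrieve(question, corpus))))
```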
 
you believe that, if it makes you feel better. i have been following the space closely since GPT-3 was released.

in the meantime, I'm going to keep integrating RAG tools, AI coding assistants, and agents into my workflow.

narrow-minded tards like urself are going to get left behind with that dismissive mindset.
mfw i have the latest privatized generative AI tools at my fingertips at an enterprise level at my workplace while u fiddle with chatgpt

 
Only 10% of what a software engineer does is coding. People still have trouble understanding this.
Depends on what kind of software engineering job we're talking about. Also, 10% may be the case for senior devs, but junior devs are coding a lot more than 10% on average.

Anyways, I believe that within the next 5 years, a hive of agents will be able to outperform humans on every step of the software development life cycle, whether that be planning, risk analysis, cost estimation or communication with clients.

When I said mass unemployment was coming, I didn't necessarily mean 100% in the short term. It's just a case of reducing headcount massively. Why bother hiring junior devs when senior devs could leverage AI and do the job of 10 juniors on their own?
 
Depends on what kind of software engineering job we're talking about. Also, 10% may be the case for senior devs, but junior devs are coding a lot more than 10% on average.

Anyways, I believe that within the next 5 years, a hive of agents will be able to outperform humans on every step of the software development life cycle, whether that be planning, risk analysis, cost estimation or communication with clients.

When I said mass unemployment was coming, I didn't necessarily mean 100% in the short term. It's just a case of reducing headcount massively. Why bother hiring junior devs when senior devs could leverage AI and do the job of 10 juniors on their own?
What you are talking about is AI running an entire company, which just can't happen. AI can only be used in specific roles in software development. There's also something called code quality that many companies these days are taking into account. A human coder will always have better code quality than AI because of out-of-the-box thinking and years of experience. A software developer also has more dynamic roles in a company than just one specific task that AI can replace. Like @User28823 said, most companies already have these tools at their disposal but still need coders to fix the AI's code before deployment.
 
mfw i have the latest privatized generative AI tools at my fingertips at an enterprise level at my workplace while u fiddle with chatgpt

Okay, good for you.

I don't know where you work, so I can't offer any proper insight into the model(s) you're using.

If it's proprietary from a small company, then it's probably a GPT wrapper. If it's an enterprise solution from a company like Microsoft, then it's probably a fine-tuned version of GPT lol. The difference between commercial and enterprise is marginal at best atm.
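For context on the "GPT wrapper" point: a lot of these products reduce to a fixed system prompt around a single API call. A minimal sketch, assuming the openai Python package (>=1.0) and an OPENAI_API_KEY in the environment; "AcmeCorp" and the model name are placeholders, not any real vendor's product.

```python
# A "GPT wrapper": some branding and light templating around one API call.

from openai import OpenAI

client = OpenAI()  # picks up OPENAI_API_KEY from the environment

def enterprise_assistant(user_text: str) -> str:
    """The 'enterprise product' is often just this: a fixed system prompt plus the vendor's model."""
    response = client.chat.completions.create(
        model="gpt-4o",  # placeholder; a vendor might swap in a fine-tuned variant
        messages=[
            {"role": "system", "content": "You are AcmeCorp's secure internal coding assistant."},
            {"role": "user", "content": user_text},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(enterprise_assistant("Write a unit test for a date parser."))
```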
 
What you are talking about is AI running an entire company, which just can't happen. AI can only be used in specific roles in software development. There's also something called code quality that many companies these days are taking into account. A human coder will always have better code quality than AI because of out-of-the-box thinking and years of experience. A software developer also has more dynamic roles in a company than just one specific task that AI can replace. Like @User28823 said, most companies already have these tools at their disposal but still need coders to fix the AI's code before deployment.
"What you are talking about is AI running an entire company which just can't happen. AI can only be used in specific roles in Software development."

Not what I'm saying in the short term. I'm saying that companies will have significantly reduced headcounts. Senior devs will act as quality control for the output produced by AI. There will be no need for junior devs.

"There's also something called code quality that many companies these days are taking into account"

And you're assuming that the code produced by AI will always be sub-optimal. Hallucinations will be a thing of the past, and the quality of code will reach a point where the opportunity cost of training up a junior dev won't be worth it. The models will get better. The reason I made this thread is that I believe the RL techniques behind o1, leveraged with a model trained on 10x the compute of GPT-4o, will take us to the fulcrum where a lot of AI-produced code reaches the quality where it's ready to commit or merge as a PR without any refactoring needed.

"A software developer also has more dynamic roles in a company than just one specific task that AI can replace"

I believe we will have specialised agents capable of replicating any role of a software dev, not just the coding aspect. For the record, I think software dev is one of the hardest jobs to automate, so the fact it could be automated would mean most white-collar work is cooked.

"A human coder will always have better code quality than AI because of out of the box thinking".

This, I think, is hubris. A lot of human "ingenuity" is retrieving existing knowledge and recombining it in novel ways. LLMs can do this but 1,000,000x faster, and with a significantly larger pool of training data to pull from than the human brain.
 
automating lawyers would mean they'd have to automate the thousands-of-years-old, untouched justice system. i don't think that will ever happen in our lifetime. maybe the assistants and irrelevant positions linked to lawyers might get eliminated, but lawyers are here to stay ig
Very easy for AI in the near future.
 
Can’t wait to use this for homework and looksmaxxing questions
 
no change, it's still just gonna do low iq tasks, like predicted, which is good, there's no singularity jfl
not yet. but if OpenAI sticks to their timeline, it should be ready in 5 years. GPT-5 will probably be an agent, which is the 3rd level of AGI. the next steps after that will be innovators and then organizations; both will rely on multimodal learning, which they are already training for right now
Cope. I've been hearing shit like this ever since ChatGPT launched.
ChatGPT really started to get hyped at the beginning of 2023. GPT-3.5 released at the end of '22. That is still a very short amount of time. Even the majority of tech optimists weren't predicting we'd have reasoners in 2024 already. That is HUGE. Think about what will happen in 5-10 years.
A human coder will always have better code quality than AI because of out-of-the-box thinking
This shit is brought up every time AI is discussed. Most people assumed it wouldn't even be possible for LLMs to pass law exams, and then many people were sure AI wouldn't be able to create new songs or art from scratch. All of that got disproven pretty much within the last year.
AI has the potential to do EVERYTHING better and faster than humans in the future. The only question left is when.
 
not yet. but if OpenAI sticks to their timeline, it should be ready in 5 years. GPT-5 will probably be an agent, which is the 3rd level of AGI. the next steps after that will be innovators and then organizations; both will rely on multimodal learning, which they are already training for right now

ChatGPT really started to get hyped at the beginning of 2023. GPT-3.5 released at the end of '22. That is still a very short amount of time. Even the majority of tech optimists weren't predicting we'd have reasoners in 2024 already. That is HUGE. Think about what will happen in 5-10 years.

This shit is brought up every time AI is discussed. Most people assumed it wouldn't even be possible for LLMs to pass law exams, and then many people were sure AI wouldn't be able to create new songs or art from scratch. All of that got disproven pretty much within the last year.
AI has the potential to do EVERYTHING better and faster than humans in the future. The only question left is when.
Stop giving me more reasons to commit suicide nigga... :feelswhy:
 
use an AI learning model to become the greatest sports bettor of all time
I used GPT-4 to predict scores for the Euros; it got everything wrong other than 1 game. Muh.
 
Bro, what should I do? I'm 16. How can I make it to the elites who'll own everything in the next 50 years?
 
