Software engineering is the worst choice you can make

It doesn't matter how niche something is: an AI will be able to read all of the documentation and all of the source code, and experiment until it has figured out how to use whatever niche code is involved, faster than a human.
Also, it seems you do not understand how machine learning works. The "reading" and "source code" will all be part of the training data and used as context, yes, but so will a much larger amount of data. Although one may highlight this context, the data as a whole is weighted to give an outcome. Hence why you may get "hallucinations" on even the simplest commands.

Although the chance of hallucinating on a straightforward answer covered by documentation is very low, imagine this at the scale of a project requiring many systems and many files of code all interacting to form a solution: the chances are much higher that something will be off. Now, as a developer, if you are relying on these models for the simplest of solutions, how do you expect to solve, or even prompt these models to solve, these issues? This is just one example where your scenario falls apart, to say nothing of situations where documentation is scarce and hard to follow (which is not a rare occurrence).
 
What outlooks? How did they calculate their predictions?

I don't think you've looked into this at all.


We are on track to create smarter-than-human intelligence. This is intelligence that can solve any problem that humans can, faster than humans can.



That's the paradox of automation. You can read about it if you look up "paradox of automation".

It stops applying when a field becomes fully automated: elevator operators, pin boys, data entry clerks, etc.

It's not a matter of "if" but "when" programming will be fully automated. If you're open to discussing the "when", we can do that, but you should first admit that programming is going to be fully automated.

People who write software don't care who or what creates the software, as long as it's bug-free, does what it's supposed to do, and is written in a way that's maintainable. Current AI can't do this as well as humans, but future AI will be able to.
You can refer to my previous post as a response to most of this. Also, as for these outlooks and predictions, do me a favor and read the latest articles in any renowned AI/ML journal regarding the state of AI use for coding or similar tasks.

Brave statement, saying I have not looked into this. I work heavily in research in a quantitative field and study machine learning, with work on published papers regarding the use of multimodal models for quantitative finance (similar to coding in the sense of constraints). I have been around models and agents extensively in industry at some of the top companies you have heard of, and have colleagues at many others.
 
Also, it seems you do not understand how machine learning works.
I guarantee I have a better understanding of architectures for artificial general intelligence than you do. AGI architectures are what this entire field is moving towards.

The "reading" and "source code" all will be apart of the training data and used as context,
That's not how the latest models work. They can learn in real time, after they are trained. They also have memory, meaning they can read something and remember it.

Agents will be able to conduct experiments, and learn from those experiments.

yes, but so will a much larger amount of data. Although one may highlight this context, the data as a whole is weighted to give an outcome. Hence why you may get "hallucinations" on even the simplest commands.

Although the chance of hallucinating on a straightforward answer covered by documentation is very low, imagine this at the scale of a project requiring many systems and many files of code all interacting to form a solution: the chances are much higher that something will be off. Now, as a developer, if you are relying on these models for the simplest of solutions, how do you expect to solve, or even prompt these models to solve, these issues? This is just one example where your scenario falls apart, to say nothing of situations where documentation is scarce and hard to follow (which is not a rare occurrence).
You're talking about current public models. I'm discussing where the field is moving.

The latest models already learn from their own thinking. They think for a while, and then update their weights based on what they've learned from their own thoughts.

The next step for agents is agents that conduct experiments and learn from them. That means an agent "plays with an API" like a human would. It will feed different data into the functions and remember the results. Important information from short-term memory will go into long-term memory (updating the weights), improving the agent's intuition so it takes fewer experiments to figure out future APIs.
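To make the idea concrete, here is a minimal sketch of that experiment-and-remember loop (the `call_api` function, the memory dict, and the candidate inputs are all made up for illustration; a real agent would update weights or a persistent store rather than a Python dict):

```python
# Hypothetical sketch: an agent probes an unknown API function, records what
# worked and what failed, and keeps the working example for future tasks.

def call_api(payload):
    # Stand-in for some niche, poorly documented function the agent is exploring.
    if not isinstance(payload, dict) or "id" not in payload:
        raise ValueError("payload must be a dict with an 'id' field")
    return {"status": "ok", "echo": payload["id"]}

long_term_memory = {}  # stand-in for weight updates / a persistent store

def experiment(candidate_inputs):
    """Try each candidate input and record the outcome."""
    results = []
    for payload in candidate_inputs:
        try:
            out = call_api(payload)
            results.append((payload, "success", out))
            long_term_memory["working_payload"] = payload  # remember what worked
        except Exception as err:
            results.append((payload, "failure", str(err)))
    return results

# The agent guesses, fails a couple of times, then learns the required shape.
for payload, outcome, detail in experiment([42, {"name": "x"}, {"id": 7}]):
    print(payload, outcome, detail)
print("remembered for next time:", long_term_memory.get("working_payload"))
```

The only point of the sketch is the loop itself: try, observe, store. The "intuition" part would be the model getting better at guessing a working payload on the first attempt.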

You can refer to my previous post as a response to most of this. Also, as for these outlooks and predictions, do me a favor and read the latest articles in any renowned AI/ML journal regarding the state of AI use for coding or similar tasks.
This is not a prediction of how AI will evolve, only an assessment of its current capabilities.

Brave statement, saying I have not looked into this. I work heavily in research in a quantitative field and study machine learning, with work on published papers regarding the use of multimodal models for quantitative finance (similar to coding in the sense of constraints). I have been around models and agents extensively in industry at some of the top companies you have heard of, and have colleagues at many others.
Then you have a very limited understanding of AI.

Current algorithms and models exist primarily to bootstrap future, more powerful algorithms and models. These models will always be running experiments, always learning from the results of those experiments, always thinking, and always learning from the results of those thoughts.

Yes, there are some use cases that you and your colleagues are playing with, but these models are primarily stepping stones to agents with memory that learn and think.
 
You're wrong though. Although the majority of simple web dev tasks, like creating a page and such, can be done using mostly LLMs, there are many other things to consider as you become a more senior or specialized engineer.

Distributed systems, identification of latency bottlenecks in pipelines, and algorithmic considerations and constraints, just to name a few. These will never be fully solved by LLMs, because you often need solutions so unique and niche that an LLM or AI won't suggest them unless you basically prompt the solution yourself.

Even in basic CRUD work at the top companies, much of the infrastructure is internal and constantly changing, and because of this even the in-house LLMs struggle to keep up with useful solutions for fairly mundane tasks.

Junior devs will always be needed, as they must be trained on the systems and workflows to bring them up to senior.

Jobs are not going anywhere.
The same argument was made to me around a year ago.
What you are expecting is something to run a whole company; I thought your whole argument was engineers getting replaced by AI, which are two completely different things. It does have contextual understanding: Claude 2.1 literally demonstrated it with 200k tokens, although it was nowhere near as consistent as ChatGPT.

Then why don't you enlighten me on how either of those can resolve a zero-day vulnerability in a timely manner, instead of providing another ad hominem; you're running out of those.
It won't. Companies have DevOps lifecycles to do this; it should always be automated. Of course companies with ancient infra need it, and you can just use ChatGPT to help migrate an entire infrastructure, even as you were saying "those models still lack contextual understanding and are incapable of dynamically learning from ongoing discussions the same way humans can". I've literally used ChatGPT to migrate from manual deployments to something more scalable with Kubernetes and Helm.

Someone will always be on call, but just how many of them will be replaced in the next 3-5 years, no one knows. I don't see it "furthering your point".
I mean, when over 75% of them believe replacement is possible, I want to hear them out. The companies that produce these chatbots believe large-scale replacement will happen (specifically Altman), and Anthropic was birthed out of concern for AI safety. The people on Blind aren't as delusional as the people on LinkedIn; most of them are shitposting, but a lot of them are at prestigious companies because they are the top engineers.
Also, there are dozens of posts on Blind with a ton of senior engineers talking about being replaced; why do you think they say this despite being some of the most competitive candidates?


GPTs are supposed to be a step towards this; if your company has ever tried them out, you would know. I never said jobs are entirely gone, I'm just saying we are already seeing that crunch, and getting a degree no longer equals getting a job like it did between 2014 and 2020. There is literally no other reason for Meta, Microsoft, and Google to all soon be doing their largest-percentage layoffs based on "performance". Go look at any of the FAANG internal Blind channels and see why they are freaking out so much about it.

In conclusion, no, I don't think jobs are disappearing just yet; I still hold the same sentiment as I did a year ago. I'm not saying this for myself, I have done like 5 internships at notable companies; it's just a warning for someone going into SWE 4 years from now to look out. Jobs transforming to use LLMs may not necessarily create more jobs.
 
Also, it seems you do not understand how machine learning works. The "reading" and "source code" will all be part of the training data and used as context, yes, but so will a much larger amount of data. Although one may highlight this context, the data as a whole is weighted to give an outcome. Hence why you may get "hallucinations" on even the simplest commands.

Although the chance of hallucinating on a straightforward answer covered by documentation is very low, imagine this at the scale of a project requiring many systems and many files of code all interacting to form a solution: the chances are much higher that something will be off. Now, as a developer, if you are relying on these models for the simplest of solutions, how do you expect to solve, or even prompt these models to solve, these issues? This is just one example where your scenario falls apart, to say nothing of situations where documentation is scarce and hard to follow (which is not a rare occurrence).
You can refer to my previous post as a response to most of this. Also, as for these outlooks and predictions, do me a favor and read the latest articles in any renowned AI/ML journal regarding the state of AI use for coding or similar tasks.

Brave statement, saying I have not looked into this. I work heavily in research in a quantitative field and study machine learning, with work on published papers regarding the use of multimodal models for quantitative finance (similar to coding in the sense of constraints). I have been around models and agents extensively in industry at some of the top companies you have heard of, and have colleagues at many others.
I mean, I sure hope you're justified. I assume you have at least done PhD-level+ research into this; if so, I shouldn't be talking. But being a data scientist or machine learning engineer at a FAANG is far different from being an actual researcher. For example, even Yann LeCun has changed his opinions multiple times in the past months; why has his opinion shifted so much, to the point that he's just reiterating Amodei and Altman?
 
Indians will take over

All the white incel programmers will be replaced by Rajeet, who earns 30% of their salary but works longer hours.
 
What a cope thread tbh. SE has always been a meme. The endgame was always to build your own thing or to pivot to teaching. Or get lucky and inherit some niche garbage to maintain.
 
That's such a poorly skilled dev type of mindset.
As an employee or a freelancer you can work on amazing projects and earn tons of cash and recognition.
Building your own thing is cool, yes, except if it's just another sketchy SaaS no one will use.
Waiting to inherit legacy code to maintain is just a fat lazy fuck thing to hope for.
Teaching is only for the most skilled people or the biggest losers who didn't succeed in the field.
 
Just wanted to bump this thread after Zuck and Google started using AI-written code in their own codebases.
 
Unnatural incel field tbh, chads are in Medicine or working some meme tier lowly jobs
 
True, very few chads in the field, but there are some.
They either end up in lead/management positions or building startups, but rarely remain just regular devs.
 
That's such a poorly skilled dev type of mindset.
As an employee or a freelancer you can work on amazing projects and earn tons of cash and recognition.
Building your own thing is cool, yes, except if it's just another sketchy SaaS no one will use.
Waiting to inherit legacy code to maintain is just a fat lazy fuck thing to hope for.
Teaching is only for the most skilled people or the biggest losers who didn't succeed in the field.
You are right, but none of these things are a good thing either way.

Idgaf about being "skilled" or "useful". I care only about money and having fun. Amazing projects and recognition are slave copes.

Inheriting legacy code being a "fat lazy thing to hope for"? Yes, exactly! I want to put in minimal effort and get paid.

As for teaching, I admire the so-called "losers". They put in less effort and still get paid.
 
So basically if you are into CS and not top IQ, it's over.
I graduated from one of the best CS schools of a first-world country and now I'm in charge of recruiting devs.

What I'm witnessing is quite sad.
Today I interviewed a dude for an internship and he told me "I picked this school because there are endless opportunities of work for devs."
He couldn't be further from the truth, and here is why:

- The shortage of devs led to massive investment in creating dev schools; people are being massively recruited into those schools because they think the shortage is still here and they will immediately have a job when they get their diplomas.
- This increased number of students leads to massive numbers of internship applications, thus creating more cheap labor for companies; there are some startups that run almost exclusively on interns and other underpaid devs.
- Devs in third-world countries can work twice the hours for half the money of Westerners. These guys are hungry to succeed. I've seen several people lose their jobs to companies relocating to places like India; some of them were boomers at the end of their careers so they didn't care, some weren't and were very upset with the situation.
- AI is going to remove a lot of the need for devs. Just like WordPress allowed random people to create a site, or Shopify an online store, AIs are going to allow less knowledgeable people to build more complex tools.
- Salaries are decreasing. When I compare the salaries I was offered a while ago with the salaries now, they either stagnated and got wrecked by inflation or even got smaller, except for highly qualified senior profiles.
- I honestly think the majority of devs are suffering from depression and are heavily affected by stress. I think this is one of the most damaging professions when it comes to mental health. This was the case when the profession was well paid and had endless job opportunities; imagine now, with a job crisis and AI. So many people are going to be instantly doomed.

I'm personally doing well but don't think dev is a viable option for a good life. I think I was more socially and mentally fulfilled with simpler jobs like bartending.
There are a lot of jobs that may pay less but won't have you losing your hair sitting all day long at your desk, worrying if an AI chad is going to steal your job and bang your Becky.

Be safe out there
Yep, I already know this. I have a master's degree in cybersecurity and computer science. Can't get a job.

I am personally suffering from it myself; it's a dead field, and that's why I now do project management.

And it's not just software engineering, it's also cybersecurity; it's impossible to break through now.
 
You are right, but none of these things are a good thing either way.

Idgaf about being "skilled" or "useful". I care only about money and having fun. Amazing projects and recognition are slave copes.

Inheriting legacy code being a "fat lazy thing to hope for"? Yes, exactly! I want to put in minimal effort and get paid.

As for teaching, I admire the so-called "losers". They put in less effort and still get paid.
I mean, if you are OK having a shitty salary and no ambitions, good for you.
Most people need to get something more than just easy money from what they do on a daily basis; I couldn't imagine myself having the same boring job maintaining legacy crap for years.


Yep, I already know this. I have a master's degree in cybersecurity and computer science. Can't get a job.

I am personally suffering from it myself; it's a dead field, and that's why I now do project management.

And it's not just software engineering, it's also cybersecurity; it's impossible to break through now.
Project management works too, but it can be a life-sucking experience depending on where you work.
 
PM is far more reliable though. Tech is a joke and oversaturated; unless you're in the top 10% of the field, you'll never get a job 🤷‍♂️
 
It is official: it is now cheaper to use an LLM than to hire a junior dev.
An LLM can be assigned basic tickets and work on them autonomously, recursively fixing its own errors, then commit the code so a senior can review it (roughly the loop sketched at the end of this post).
Tbh some companies might accumulate technical debt because they will make junior/intermediate devs vibe code all the way.
Then senior devs will be needed to untangle all the shit

Junior devs are dead, senior devs will be kings.
Market is crazy hard for newbies rn
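Roughly, the loop being described looks something like this (a hedged sketch only; `generate_patch`, `run_tests`, and `open_pull_request` are made-up stand-ins for illustration, not any real tool's API):

```python
# Hypothetical sketch of the "assign a ticket, fix your own errors, hand off
# for review" loop. Every helper below is a made-up stand-in for illustration.

MAX_ATTEMPTS = 5

def generate_patch(ticket, feedback, attempt):
    # Stand-in for an LLM call that turns a ticket (plus prior test failures)
    # into a proposed code change.
    return f"patch for {ticket} (attempt {attempt}, feedback: {feedback or 'none'})"

def run_tests(patch):
    # Stand-in for running the project's test suite against the proposed change.
    # Here we simply pretend the first two attempts fail and the third passes.
    passed = "attempt 3" in patch
    return passed, "" if passed else "AssertionError in test_pagination"

def open_pull_request(ticket, patch):
    # Stand-in for committing the change and requesting senior review.
    print(f"PR opened for {ticket!r}; awaiting human review of: {patch}")

def work_ticket(ticket):
    feedback = ""
    for attempt in range(1, MAX_ATTEMPTS + 1):
        patch = generate_patch(ticket, feedback, attempt)  # model proposes a change
        passed, output = run_tests(patch)                  # check it against the tests
        if passed:
            open_pull_request(ticket, patch)               # senior still reviews before merge
            return True
        feedback = output                                  # feed failures back in and retry
    return False                                           # escalate to a human

work_ticket("JIRA-123: fix off-by-one in pagination")
```

The senior review step is the whole point: the loop can grind away on a basic ticket, but someone still has to own what gets merged.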
 
Well, if you ask me, AI bots will slowly but surely replace devs. I can see the early destruction with my own eyes: for instance, my cousin tried to build an AI bot to cater to and manage his clients' requests by asking ChatGPT, with zero knowledge of any programming language. If this is the case, why can't a company do it? He did it just sliding around on his phone screen for a good 25 minutes.
 
Tbh building a whole project as a non-technical person will only lead to poor service; people don't understand how complex it is to go from a side project coded with AI in a few days to a fully production-ready application.
I forecast that in the coming years a lot of SaaS products will appear with very low product quality because of this. That's when those companies will have to recruit senior devs to take over the project and patch all the unfinished work, if they don't want their customers to go away.
 
What about careers as an ML engineer? Starting salaries seem to be at least double those of software dev/engineering positions right now.
Software development is a fucking joke.
 
I guess it's OK until the bubble pops; long term, I don't know.
But it is certainly a good skillset to have.
 
