In my experience, tech employment is incredibly bimodal right now. Top candidates are commanding higher salaries than ever, but an "average" developer is going to have an extremely hard time finding a position.
Contrary to what many say, I don't think it's as simple as seniors are getting hired and juniors aren't. Juniors are still getting hired because they're still way cheaper and they're just as capable at using AI as anyone. The people getting pushed out are the intermediates and seniors who aren't high performers.
I generally tend to interview every year to see what's out there in the world (sometimes I find something worth switching for, other times not). I'm not even looking very hard but have had 4 interviews in the last month.
Personally I think it's a bit more nuanced than senior vs junior (though it is very hard for juniors right now). What I've seen a lot of hunger for is people with a track record of getting their hands dirty and getting things solved. I'm very much a "builder" type dev that has more fun going from 0-v1 than maintaining and expanding scalable, large systems.
From the early start of the last tech boom through the post-pandemic hiring craze I increasingly saw demand for people who were in the latter category and fit nicely in a box. The ability to "do what you must to get this shipped" was less in demand. People cared much more about leetcode performance than an impressive portfolio.
Now reminds me a lot of 2008 in terms of the job market and what companies are looking for. 2008-2012 a strong portfolio of projects was the signal most people looked for. Back then being an OSS dev was a big plus (I found it not infrequently to be a liability in the last decade, better to study leetcode than actually build something).
Honestly, a lot of senior devs lose this ability over time. They get comfortable with the idea that as a very senior hire you don't have to do all that annoying stuff anymore. But the teams I see hiring are really focused on staying lean and getting engineers who are comfortable wearing multiple hats and working hard to get things shipped.
> I'm very much a "builder" type dev that has more fun going from 0-v1 than maintaining and expanding scalable, large systems.
Maintaining and expanding is more challenging, which is why I’ve grown to prefer that. Greenfield and then leaving is too easy, you don’t learn the actually valuable lessons. As experience shows that projects won’t stay in the nice greenfield world, building them can feel like doing illusory work — you know the real challenges are yet to come.
It is more challenging, but I feel like it also has fewer people looking for that. That whole "move fast and break things" phrase messed with too many people's heads. I don't think people appreciate this segment of a product's life cycle as much as they should. They're always looking for the quick solutions.
Not sure what type of "greenfield" startup experience you've had, but most of the work I'm talking about involves solving problems that most people simply don't have the combined skill set to solve. Typically problems that involve a substantial amount of quantitative skills combined with the ability to bring those solutions to prod.
Nearly all of the teams I've joined had problems they didn't know how to solve and often had no previously established solution. My last gig involved exploring some niche research problems in LLM space and leveraging the results to get our first round of funding closed; this involved learning new codebases, understanding the research papers, and communicating the findings publicly in an engaging way (all to typical startup-style deadlines).
I agree with your remarks around "greenfield" if it just involves setting up a CRUD webapp, but there is a wide space of genuinely tricky problems to solve out there. I recall a maintainer-style coworker of mine, who described himself much as you describe yourself, telling me he was terrified of the type of work I had to do, because when you started you didn't even know if there was a solution.
I have equal respect for people such as myself and for people that you describe, but I wouldn't say it is more challenging, just a different kind of challenge. And I do find the claim "you don't learn the actually valuable lessons" to be wildly against my experience. I would say most of my deep mathematical knowledge comes from having to really learn it to solve these problems, and more often than not I've had to pick up on an adjacent, but entirely different field to get things done.
FWIW this is the best kind of job ever for me as well:
"when you started you didn't even know if there was a solution."
Regardless what the problem is - as long as I know _nobody knows if there is a solution_ it's an instant sugar rush.
You are free to bang your head against a stone wall for months trying to crack the damn thing.
OFC you need to deliver in the end. And this then requires a ruthless "worse is better" mentality - ship something - anything. Preferably the smallest package that can act as an MVP that you know you can extend _if this is the thing_ people want.
Because that's the other side of the coin - if the solution is not known - people are also not aware if the solution has true value or if it is a guess.
So in any case you have to rush to the mvp.
Such joy!
Of course the mvp must land, and must be extensible.
But these types of MVPs are not a slam dunk.
The combined requirements of (a) it must ship within limited time and (b) nobody knows what it should be _do_ require a certain mindset.
Interesting, this is a trap that I've seen multiple senior hires fall into. In my experience "we have a problem but no solution" often means (a) there actually is a solution but it's too expensive to implement, (b) there are organizational reasons why this problem exists and a new hire doesn't have the experience or credibility to navigate them, or (c) there is no solution, or the solution is very complex, and by the time the new hire onboards, digs into the problem, and figures it out, their credibility is shot because everyone was expecting the senior hire to figure it out in 90 days.
I've found new hires to be more successful when they join, get some easy wins, and then find their own problems to solve. But maybe it's just an artifact of working at large companies where most of the day-to-day stuff is figured out.
(d) although the initial statement seems credible, the problem is actually ill-defined and under-specified, and therefore not solvable as originally stated.
"Interesting, this is a trap that I've seen multiple senior hires fall into."
Definitely it's a trap. If you are a purist it's nigh impossible. But if you ruthlessly 80/20 it, most stakeholders will be pleasantly surprised.
I have no clue why I end up in these situations but I sure do like them.
I do realize this may sound like perpetual "not invented here" syndrome, but technical implementation of modeling aspects for 3D and computational geometry is such a scarce talent that you actually get to do novel stuff for your business.
The last time this happened I designed & implemented the core modeling architecture and led the implementation effort for our new map feature[0].
Yup. You learn the most valuable information from watching how things break and then fixing them.
It's kind of like when the FAA does crash investigation -- a stunning amount of engineering and process insights have been generated by such work to the benefit of all of us.
> You learn the most valuable information from watching how things break and then fixing them.
Trust me, you get plenty of experience in this as a founding engineer in a startup.
Many of these comments make me wonder how many people here have actually worked at an early-stage startup in a lead role. You learn a lot about what's maintainable and scalable, what breaks and what doesn't, in the process of rapidly iterating on a product to find your market.
I don't think HN has been frequented by startup engineers with leadership responsibilities in any density in a long time. It's very obvious to me reading a lot of the comments here that most folks are ICs somewhere in a large, bureaucratic software organization. That's why there's so much BOFH style commentary here these days.
(For readers: I don't think there's anything wrong with that, but it does mean that certain perspectives are overrepresented here that may not be reflective of the broader industry.)
That's what I've come to realize. For most of the commentators here "greenfield" means typing in 'npm init', for me it usually means doing three different roles in order to iterate as fast as possible on the product to find your market, then figuring out how to scale it to the new users you've started acquiring.
The idea that this means "you don't learn the actually valuable lessons" is completely baffling to me.
Most people I've known with founding-engineer experience or similar leave not because it's not challenging, but because it's exhausting.
Increasingly I've realized that the HN community and I are not even speaking the same language.
Some of the sharpest engineers I knew built tools and business processes at startups and watched them fail as they scaled. I ran an internal presentation for years at a unicorn where I was an early employee, called "Failure at Scale," where I tried to capture lessons from huge incidents we had that were caused by crossing scaling thresholds. Eventually the presentations stopped being meaningful because the company became too big and too removed from its origins.
No, but it usually doesn't mean achieving stability at tens of thousands of users a day (or hour) and ensuring that stability while rolling out new features, migrating infrastructure, etc.
You definitely do. Do you think Anthropic isn't working with thousands of users an hour? They're struggling to keep up with the scale and their ability to create a stable platform is, well, existential for them. Do you think Anthropic isn't a startup? The pace of their feature rollouts is exponential.
Even in areas where startups aren't literally creating new product categories like the foundational model providers, the edge of a startup over a more established business is the speed at which they can provide value. What's the point of buying CoolCo when you can go with L&M Inc. that has thousands of headcount working on your feature. The value prop of CoolCo is that CoolCo can roll out a feature in the time it takes L&M to make a detailed specification and a quarterly planning doc breaking down the roadmap and the order of feature implementation.
> Trust me, you get plenty of experience in this as a founding engineer in a startup.
Now be part of the team of folks that keeps that application running for 10, 20, 30 years. Now be part of the transition team to the new app with the old data. Those tasks will also teach you a lot about system stability, longevity, and portability... lessons that can only be learned with more time than a startup has.
Valuable in what metric? I'm very much in the brownfield-has-the-lessons camp, but one of the lessons is that this experience has a very low market value. In fact it's so impossible to downgrade from "senior in $outdated" to "junior in $whateverisconsideredhotrightnow" that any brownfield experience could easily be considered to have negative market value.
I don't think they're mutually exclusive. You could just as easily describe someone with bootstrapping experience as being like an FAA crash investigator who investigates take offs. You get to know exactly what works when moving fast and looking for quick results, and what dooms a short timeline to failure.
> You could just as easily describe someone with bootstrapping experience as being like an FAA crash investigator who investigates take offs.
Takeoff systems aren't analogous to prototype development. I don't know how you'd build a prototype plane that's feasible to take to market without having deep knowledge of how planes are built.
Early design decisions matter. And you don't get to that realisation without dealing with legacy systems where some upstart made terrible decisions that you're now responsible for.
There was a two-year period around 2011-2013 where I experimented with my dev team. We were being "forced" to migrate a legacy enterprise system from .Net/MSSQL to Java/PostgreSQL and replace the front end with a modern, reactive web interface. Only two of the existing developers had Java experience, and both were conveniently senior engineers in two different offices, each running their own discrete sub-team.
One of the guys had a very strong opinion that the ideal architecture was something as abstracted and object-oriented as possible, with single-function classes, etc. I let him run with that. The other guy got frustrated with his sub-team's inability to write code to spec in a language they'd never used before, where they were trying to build some new features they didn't clearly understand. He developed a strong feeling that TDD was the most efficient path forward: he owned the PRD and design, so he just created test stubs and told the remote team to "just write code that passes the test," even if they didn't understand the function of the block.
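As an aside, that "stub first" workflow is easy to illustrate. This is a hypothetical sketch (the actual project was Java; the function and the behavior here are made up for illustration): the spec owner writes the failing test stub first, and the implementer's only job is to make it pass.

```python
# Spec owner's side: a test stub that encodes the required behavior
# before any implementation exists.
def test_normalize_account_id():
    assert normalize_account_id(" ab-123 ") == "AB123"
    assert normalize_account_id("AB123") == "AB123"

# Implementer's side: write whatever makes the stub pass, even without
# full context on where this fits in the larger system.
def normalize_account_id(raw: str) -> str:
    return raw.strip().upper().replace("-", "")

test_normalize_account_id()
```

The point isn't the code itself but the division of labor: the tests carry the spec, so a team new to the language can contribute correct blocks without first understanding the whole design.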
So, after a few months where did we end up:
1. The "abstract everything" architect's team had an extremely fragile, impossible-to-maintain codebase, because it was impossible for any outsider to tell what was going on.
2. The "just pass the damn tests" guy had a team that had quickly ramped on a new language and they had a codebase that was incomplete (because they were building it like a Lego project) but that everyone could understand because the code blocks generally stood on their own.
What was the next step? Shut down the guy who abstracted everything and have him drive a quick & dirty rewrite that would be more maintainable, and also start a major refactoring of the "Lego" team's code, because it was so fragmented that it too was fragile and unsuited for production.
I saw this as a terrific learning experience for all involved and I was able to get away with it because the stakes were pretty low and we had time to experiment (part of the ultimate objective was upskilling the team), but the more important lessons were these:
1. Docs matter. Take the time to write clear & detailed specs first because you'll be forced to think of edge cases and functionality that you didn't originally, and it provides a basis for system design, too.
2. Architecture & design matter. Adhering too close to any single paradigm is probably a mistake, but it takes experience on the team to understand where the compromises are and make the best decision for that system.
That second point will not stop being true with the advent of agentic assisted software development. Like others have said, my expectation in the job market is that pay will continue to be depressed for junior hires as employers reset expectations and generally just want folks who can instruct fleets of agents to do the actual coding. Senior staff will become increasingly critical and their jobs will be painful and difficult, because it'll be assumed they can (and will be willing to) do design & code reviews of artifacts originated by agents.
What I am going to be most interested in is what happens in the SRE/Sysadmin world over the next few years as more AI-generated code hits prod in organizations that don't have adequate review & oversight functions.
> What I am going to be most interested in is what happens in the SRE/Sysadmin world over the next few years as more AI-generated code hits prod in organizations that don't have adequate review & oversight functions.
You kind of answered the question yourself. Humans write the tests and then go tell the AI to write a solution that passes the tests.
> Greenfield and then leaving is too easy, you don’t learn the actually valuable lessons.
You learn a ton of valuable lessons going from 0 to v1. And a ton of value is created. I guess I'm unclear how you're defining "actually valuable" here.
Having worked at both greenfield startups and unicorns, I've found that virtually every problem I've encountered at the unicorn startups was caused by folks being incompetent at the greenfield level. Maybe when you get to the scale of Google things are different, but it's certainly possible to build a business big enough to retire off that doesn't require any more technical knowledge than what you'd learn at a two-person pre-PMF startup.
I suspect the issue is the parent has never worked in an early-stage role at a growing startup still iterating to find product-market fit. If they had, they would realize you learn a lot about "maintaining and expanding," especially when your prototype now has a bunch of users.
This is evident in my personal experience by the fact that I am often the one that sees scaling and maintenance issues long before they happen. But of course parent would claim this is impossible.
If v1 is successful and attracts a lot of users, it will have to have features added and maintained.
Doing that in ways that don't produce "legacy code" that will have to be thrown away and rewritten in a few years is a very different skill than getting v1 up and running, and can easily be what decides whether you have a successful business or not.
Taking this to the extreme, I think most lessons represent sunset or dead projects. There's no sweet illusions anymore. No assumptions. No ego. No account for infinite flexibility. No shine. No excitement of a new thing. No holy wars. No astronaut architects. Only you, the ruins and the truth.
Soooo agree. I've had to clean up the messes of people that did the 0-1 in my field and going from 1-unconditionally stable was a lot more work than the 0-1 part.
This is unironically my favorite kind of HN comment: to say something incredibly rude and/or condescending but wrap it in the right kind of thoughtful language to qualify as HN nice
The original punchline ("you don't learn the actually valuable lessons.") was just a bit too sharp, so you even edited in a pseudo-clarification which actually just repeats that punchline in a softer way. Masterful!
My intention was actually to inspire others to maybe also start preferring long-term/maintenance work, because I feel there’s a lack of enthusiasm for that.
Almost invariably after submitting, I see how I could clarify and/or expand on my thoughts, so I often do end up editing.
This seems wrong? Like if you look at a collection of open SWENG positions, most of them are maintenance roles at large companies. Greenfield software doesn't have the revenue needed to justify much headcount.
In my experience separating the roles out is silly if you're an engineer yourself. We do this a lot and that leads to silly mentalities. Greenfield developer vs maintenance engineer, MVP engineer vs Big Tech dev, FOSS hacker vs FOSS maintainer. Each of those dichotomies speaks to cultural differences that we humans amplify for no reason.
In truth the profession needs both and an engineer that can do both is the most effective. The sharpest engineers I've worked with over the years can hack new, greenfield stuff and then go on to maintaining huge projects. Hell Linus Torvalds started out by creating Linux from scratch and now he's a steward of the kernel rather than an author!
I wasn’t talking about segregating roles, but about personal preference. People do tend to prefer building new stuff over maintaining projects long-term, and I’d like the scale to tip a bit on that. Linus is indeed a good counterexample: He didn’t leave Linux after 1.0 to build the next new thing. But the latter is what developers in practice often prefer doing.
> There's a new field in your profile called delay. It's the time delay in minutes between when you create a comment and when it becomes visible to other people. I added this so that when there are rss feeds for comments, users can, if they want, have some time to edit them before they go out in the feed. Many users edit comments after posting them, so it would be bad if the first draft always got shipped.
I've got mine set to 2. It gives me a little bit of time for the "oh no, I need to fix things" or "I didn't mean to say that" moments before everyone else can see it.
Business bros will not pay high salaries to maintain software. Software maintenance will always end in India with developers making $20/hr. Or less.
AI makes it look like these developers can do the same job the Americans did building the product to begin with. Even if things fall apart in the end, it won’t stop the attempt to order of magnitude reduce the cost for maintenance.
It's correct tho. If your entire career is nothing but greenfield development, you'll never know the result of your decisions or the impact of tech chosen.
Staff or principals that have a tenure of majority greenfield development are extremely dangerous to companies IMO. Especially if they get hired in a nontraditional tech company, like utilities, banking, or insurance.
Your list is places that treat development as a cost center, but greenfield-only devs don't want to touch that work with a 10-foot pole.
And if your entire career is nothing but maintaining and sustaining projects, you'll never know what decisions it takes to build a greenfield application that lives long enough to become a graybeard.
You'll think you do because you see all the mistakes they made, but you'll only have cynical explanations for why those mistakes get made, like "they don't care, they just make a mess and move on to the next job" or "they don't bother learning the tools/craft deeply enough before moving on, it's all speed for them".
-
To indulge myself in the niceness a bit: I don't think you write comments like the one above if you've done both, yet having done both feels like an obvious requirement to be a well-rounded Staff/Principal.
Most maintenance work suffers because of decisions made at the 0 to 1 stage. And most greenfield work fails entirely, never maturing to the maintenance stage.
So both sides have to do something right in the face of challenges unique to their side. And having familiarity with both is extremely valuable for technical leadership.
When working at larger orgs on legacy projects (which I have also done) you think "what sort of idiot did this?"
Then when you're the one tasked with getting a project shipped in two weeks that most reasonable engineers would argue needs two months, you start having to make strategic decisions at 2am about which maintainability issues will block the growth of the product on the way to funding and which ones can be fixed before 5pm by someone who will think you're an idiot in 3 years.
The soft, sensitive people today have no idea how hard a condescending asshole has to work to live up to his own standards. When they do, one should still find something to troll them with. If you can't find it, you complain about the excessive border radius creating a child-friendly, Fisher-Price kind of environment. If that is the worst you can find, they should agree and confess they have a fear of sharp edges.
Passive aggressiveness isn't the same as kindness, and it's worse than directness, but you'll get away with it here because this is just Reddit but more pretentious, with all the same biases intact, just more walls of text instead of getting to the point.
And yet it is correct. The most valuable engineers today are those who have maintained and expanded the 0..v1 crap from others, and are now driven and ambitious enough to go build the next generation of 0..v1. Armed with that experience, the crap is minimal and value maximized.
Oof, I'm gonna be the one to say it depends. This is personality-based, and the truth is a successful product has both. Even late on you want that person willing to break convention to find a new way of doing something. Early on you need some seasoning in there too.
They're incorrect, and my reply to a sibling covers why in detail.
But to reword it: if you think the reason 0 to 1 work is typically a duct-taped mess is because of a lack of experience or understanding from greenfield devs, you'll probably fail at 0 to 1 work yourself.
Not that a noob developer great at selling has never landed 0 to 1 work, crapped out a half-working mess, and left with a newly padded resume... but maintenance-only work misses out on by far the most volatile and unpredictable stage of a software project, with its own hard lessons.
The duct-taped nature of 0 to 1 work is usually a result of the intersection of fickle humans and software engineering, not a lack of knowledge.
-
People in maintenance can do things like write new tests against the production system to ensure behavior stays the same... but what happens when, one quarter into a two-quarter project, it turns out some "stakeholder" wasn't informed and wants to make changes to the core business logic that break half the invariants you designed the system around? And then it turns out you can't do that, because legal pushed back. And then a few weeks later they came to an agreement, so now we want a bit of A and a bit of B?
Or you're in consumer and there's a new "must have" feature for the space? Maybe you'd like to dismiss as "trend chasing", but that'll just doom your project in the market because it turns out following trends is a requirement for people to look at everything else you've built
Or worst of all, you know that quality engineering of the system will take 8 weeks, and there's a hard deadline on someone else's budget of 4 weeks, and you can of course decline to ship it, but then you'll need a new job. (and I know, you'll say "Gladly, I take pride in my engineering!", but again, you're probably going to end up maintaining a project that only survived by doing exactly what you quit over)
tl;dr it's Yin and Yang: you can't have one without the other, and you need to have a bit of the other side in you whenever you're working in the capacity of either to be a good technical leader.
You must be “that person” who joins a team, creates IMPACT!!!, reaps the review-time award, and fucks off to some other new team to do it all again before any of the difficult maintenance issues arise. I've spent far too much time cleaning up after people like that to ever tolerate it again.
> I'm not even looking very hard but have had 4 interviews in the last month.
How many offers did you receive? Companies have also adopted your strategy: interviewing candidates "to see what's out there" - there's a job I interviewed for that's still open after 10 months.
> Companies have also adopted your strategy: interviewing candidates "to see what's out there" - there's a job I've interviewed for that's still open after 10 months
When I was doing a lot of hiring we wouldn't take the job posting down until we were done hiring people with that title.
It made a couple people furious because they assumed we were going to take the job posting down when we hired someone and then re-post a new listing for the next person.
One guy was even stalking LinkedIn to try to identify who was hired, without realizing that many engineers don't update their LinkedIn. Got some angry e-mails. There are some scary applicants out there.
Sometimes a specific job opening needs to stay open for a long time to hire the right person, though. I can recall some specific job listings we had open for years because none of the people we interviewed really had the specific experience we needed (though many falsely claimed it in their applications, right until we began asking questions).
> some specific job listings we had open for years
If you need to wait YEARS to hire someone with some specific experience, I can guarantee that you really didn't need that person. You're doing this just to check some specific artificial goal that has little to do with the business.
>If you need to wait YEARS to hire someone with some specific experience, I can guarantee that you really didn't need that person. You're doing this just to check some specific artificial goal that has little to do with the business.
There's a difference between "critically needing" and "would benefit from."
If you can find the specialist who's done what you're doing before at higher scale and help you avoid a lot of pain, it's awesome. If not, you keep on keeping on. But as long as you don't start spending too much on the search for that candidate, it's best to keep the door open.
So this is not a job that you need to fill, it is a wish you may have and that is mostly impractical. If you really needed that person, you would go find them and pay way more than they're making now or give them something else they want to join immediately.
There is no requirement that every job opening needs to be urgently filled.
You keep repeating this like it means the job opening shouldn't exist at all. Not all job openings are for urgent demands that must be filled right away or not exist at all.
When I was a team lead at a big tech company, any requisition that was not filled at the end of each quarter was cancelled and required a fight to be reinstated. Many job listings became conflicts between:
Option 1) Hire someone sub-standard and deal with either an intense drag on the team while they came up to speed or worst case having to manage them out if they couldn't cut it.
Option 2) Give up the requisition which looked like an admission that we didn't really "need" the position, and also fails to help with senior management and director promotions tied to org size.
This always seemed pathological to me and I would have loved to have the ability to build a team more slowly and intentionally. Don't let all this criticism get to you.
Imagine working on Voyager II... or some old-ass banking software that still runs RPG (look it up, I'll wait), or trying to hire someone to do numerical analysis for the genesis of a format that supersedes IEEE float... or whatever.
There are many applications for extremely specific skill sets out there. Suggesting otherwise is, in my opinion, clearly unwise.
> If you need to wait YEARS to hire someone with some specific experience, I can guarantee that you really didn't need that person.
I've worked in specialized fields where it takes YEARS for the right candidate to even start looking for jobs. You need to have the job listings up and ready.
This was extremely true when we were working on things that could not be done remote (literal physical devices that had to be worked on with special equipment in office).
Engineers aren't interchangeable cogs.
> I can guarantee that you really didn't need that person.
So what? There are many roles where we don't "need" someone, but if the right person is out there looking for a job we want to be ready to hire them.
So what did you do when those devices broke for years while you had no local/physical person on site? You either didn't need to employ the person bad enough or didn't need the devices to function bad enough.
Engineers aren't cogs, but they are able to travel, and you can hire them by means other than full-time employment. So I suspect that was probably what you were meant to do in your situation.
Nothing about this was mission critical or even all that important or you would have found a way to solve the problem or you did and it wasn't a problem to begin with. I'm in a field where people often want to hire me for some special thing like this, but it often turns out, most of my life would be spent idle because no one company has enough demand for me. I can consult instead and be busy all year, or I can take a job for someone that's OK with me being idle for 80% of my time. I prefer the former for multiple reasons but just making an example of why hiring for specialized roles that aren't mission critical is often not the thing you should be doing.
> So what did you do when those devices broke for years while you had no local/physical person on site? You either didn't need to employ the person bad enough or didn't need the devices to function bad enough.
I don't know why you assumed that. We had teams. We just wanted to grow them.
It's implied by you wanting more people that you had more demand than could be fulfilled. Even if you have teams, it stands to reason that the device repair would have been running into backlog territory that had negative implications of some sort. If not, why hire?
> it stands to reason that the device repair would have been running into backlog territory that had negative implications of some sort. If not, why hire?
I don't know where you're getting these ideas. We weren't hiring people to repair a backlog of devices. Warranty and repair work typically goes to the contract manufacturer, for what it's worth.
Companies like to grow and develop more products. You need more people.
> I've worked in specialized fields where it takes YEARS for the right candidate to even start looking for jobs. You need to have the job listings up and ready
If this is true then those shouldn't even be public job postings. That sort of critical position is for headhunters.
> If this is true then those shouldn't even be public job postings.
Why? Not everyone is on LinkedIn or has an updated profile.
Some of the best candidates I've hired were people who were in other states who were planning to move, but waiting for the right job opportunity to come up.
We also used recruiters.
Why does it make people so angry that we posted job listings for real jobs that we were really hiring for?
Or... your company should be training potential replacements. This is what the US military and "white shoe" consulting companies do. While expensive, it guarantees that critically needed skilled staff are always available.
I recommend the article "Up or Out: Solving the IT Turnover Crisis" [0] which gives a reasonable argument for doing exactly that.
Answered elsewhere: If we're investing in someone's training we'll promote someone from within who is already familiar with the product and then backfill their simpler work.
So you had a talent pipeline, you just didn't like how hands on it was or how it took time to develop. We'd all prefer a magical unicorn applicant that checks every box, but that's never possible, especially when the specifics are best learned internally to begin with. The whole hiring angle you describe seems silly in terms of process and expectations
> So you had a talent pipeline, you just didn't like how hands on it was or how it took time to develop.
There's a lot of anger in this thread at companies for making obvious choices.
If the perfect applicant happens to be looking for a job and it can save us the time and churn of switching someone internally, then yes: I would prefer to hire that person.
> The whole hiring angle you describe seems silly in terms of process and expectations
I think the silly part of this thread is all of the comments from people who think they know how to operate a company better than the people who were actually in it.
The whole thing is that this perfect candidate doesn’t exist. How can they? You are dealing with imperfect information: a resume, plus your and their assumptions about each other. That is it. All the interview hoops are attempts to make ourselves, the hirer, comfortable with the fact that we are fundamentally taking a leap of faith. Because n=1. Because we aren’t simulating this hire 1000 times and modelling the distribution of performance. Because we haven’t accounted for all latent factors that may intersect between our work model and the hiree. Because we can’t ever know anything about the future for certain.
I think we could all be a little more mindful of that in hiring. That waiting for perfection is itself a fallacy for all these reasons and plenty more.
> There's a lot of anger in this thread at companies for making obvious choices.
Elsewhere in this thread and on Reddit, you'll see the attitude that a candidate's years of experience should be sufficient assurance for their prospective employer that they can pick up whatever other technologies are out there.
This is often coupled with the "you shouldn't need to learn new things outside of your 9-5."
Here, you are presenting a situation where a company would rather promote from within (counter job hopping culture) and would penalize someone who is not learning about new things that their current employer isn't using in the hiring process.
---
And you've mentioned it in other comments too - it's about the risk. A company hiring an individual who isn't familiar with the technology and hasn't shown the ability to learn new material is a riskier hire than one who is either familiar with it professionally or has demonstrated the ability to learn new technologies.
That runs counter to the idea of the "best" candidate being the one who is most skilled; rather, the "best" candidate is the one who is the least risky hire.
You probably don't realize that there are several thousand people without a job who could work for a company that is instead just "waiting years" to find an imaginary worker. That's what people complain about. The more companies think the way you do, the more useless open positions get listed, because companies will not hire anyone unless it's the perfect candidate in their dreams.
> You probably don't realize that there are several thousand people without a job who could work for a company that is instead just "waiting years" to find an imaginary worker.
I screen hundreds of resumes a week when hiring. I know this very well.
Hiring the wrong person can easily be a net negative to the team. Hiring too fast and desperately hiring anyone who applies is doubly bad because it occupies limited headcount and prevents you from hiring the right person when they become available.
Shame how the cost of the long game is paid by the future employee having to lie in wait, applying to you and 300+ of your colleagues' openings, praying for a bite.
The best applicants aren't lying in wait or filing hundreds of applications. They're happy where they're at, ignoring the dozen people a week who reach out trying to recruit them, until eventually they decide it's time for a change. Then they apply or get referrals to the handful of companies they find most interesting, and at least one is going to give them an offer.
So if you don't have a job opening posted on the day they're sending out applications, you may miss your shot to hire them.
I have to say I appreciate your aplomb in these responses. The whole thread is littered with shocking (and unsurprising?) tech-bro overconfidence that they can manage a situation they literally know nothing about better than someone who's already done it. Cheers to you and have a good weekend.
Or, it’s the kind of place or situation where it’s not about the job/role as some abstract commodity “function,” it’s about specialist > internal generalist > external non-specialist.
“We’re making do, but we’re kind of figuring out X as we go. That’s working for now, but the problems keep getting knottier as we grow and change—it works, but it’s expensive in terms of avoidable mistakes.
Nothing’s on fire, but if we ever got the chance, we’d value authentic expertise in this niche. But if it’s just ‘I could probably figure that out,’ we’ve already got plenty of that internally.”
Where a good hire ends up helping those internal people as they develop experience and expertise, and one that’s not right is worse than none at all.
How do you know if someone is 80-90% there without having the job posting for the profile up, and interviewing candidates who come along?
That still takes a long time if random Senior Engineer X who's looking on LinkedIn is only 10% of the way there for what you'd need for a very specialized role.
> When I was doing a lot of hiring we wouldn't take the job posting down until we were done hiring people with that title
It's a small engineering org, allegedly head-hunting one principal engineer for the whole org, so it's a single opening. 10 months later they are still hunting for their special snowflake.
> I can recall some specific job listings we had open for years because none of the people we interviewed really had the specific experience we needed
This is exactly what I mean. If you can go for years without filling a role, it's non-essential, and you are, in effect, "seeing what's out there". More and more companies are getting very picky on mundane roles, such as insisting on past experience in specific industries: "Oh, your extensive experience in low-latency comms is in telecoms? We prefer someone who's worked in TV broadcast, using these niche standards specifically, even though your knowledge is directly transferable. We don't want to waste 5 days on training"
You expect more nonessential roles and slower hiring in a slower growing economy, especially if companies only hire for full-time roles.
For example, your company might need a full-time network admin once its network grows to a certain size and complexity. You won’t hit that level for three years but you’d hire the perfect person now if you found them even though they might be spending a lot of idle time scrolling Hacker News for the first year or two. At 5x the growth rate, you’d need that person within less than a year, and you might be less picky about whether they are coming from a TV or telecom shop.
Honest question. Were these super specialized roles with such specific skill requirements that it took such a long time to find the right person? Looking back, do you think the team would have been better off hiring someone who came close enough, and supporting them to learn on the job?
> Looking back, do you think the team would have been better off hiring someone who came close enough, and supporting them to learn on the job?
More specialized.
If we wanted to train someone, we'd start with an internal candidate who was familiar with the other parts of the job and then train them on this one thing.
Hiring an outsider who doesn't know the subject matter and then teaching them is less efficient and more risky. It was better to have someone in the team learn the new subject as an incremental step and then backfill the simpler work they were doing.
I assume that this means you're sending out rejections that include a mention of "we've hired someone else for this role".
If your hiring model is hiring multiple people through one posting, then you will probably get a lot fewer angry ex-candidates acting weird (because they think you've lied to them, since the posting is still up) by sending out rejections that don't say that and just get the "we're no longer interested in you for this role" message across.
Nicer/more corporate language for both, of course.
I've been running the same job ad for 2 years now, as a recruiter for a big Canadian bank. I've been laughed at for having ridiculously unrealistic standards. I've been accused of running ghost ads.
I'm in the process of hiring the 13th person using this same job ad for new and existing teams that need a very particular type of engineer.
> How many offers did you receive? Companies have also adopted your strategy: interviewing candidates "to see what's out there" - there's a job I interviewed for that's still open after 10 months.
On the hiring side, at least in tech: interviewing really sucks. It's a big time investment from multiple people (HR, technical interviewers, managers, etc).
I'm not saying it's impossible that companies are interviewing for fun, but it seems really unlikely to me anyone would want to do interviews without seriously intending to hire someone.
> On the hiring side, at least in tech: interviewing really sucks.
I know it sucks; I've sat on the other side of the interviewing desk many times, and the charade wastes everyone's time - the candidates' most of all, because no one values their time.
> I'm not saying it's impossible that companies are interviewing for fun, but it seems really unlikely to me anyone would want to do interviews without seriously intending to hire someone.
It sounds like you've never had to deal with the BS that is headcount politics, which happens more at larger organizations due to more onerous processes. Upper management (director, VP) can play all sorts of games to protect a headcount buffer[1], and everyone down the chain has to waste their time pretending to be hiring just because the division heads want to "maximize strategic flexibility" or however they phrase it.
1. Which is reasonable, IMO. Large companies are not nimble when reacting to hiring needs. The core challenge is the conflicting goals thrust on senior leadership reporting to the C-Suite: avoiding labor shocks and maximizing profitability; the former requires redundancy, but the latter, leanness.
I am on the interviewing and screening side and understand what you're saying. I also empathize with the people I routinely reject who don't understand why they were rejected. It's hard to see why you might not be a right fit for a role.
> it seems really unlikely to me anyone would want to do interviews without seriously intending to hire someone.
I keep seeing this accusation thrown around and like you, I have a hard time seeing this. On the flip side, looking at it from the eyes of many disenchanted candidates, I can see how a theory like this is appealing and self-reinforcing.
My best projects have all been greenfield. The worst were a few years old but required tons of maintenance unrelated to the core product. Example: one place built their own ORM. Twice.
From 2022. Funny that soon after that we figured out how to automate the Tactical Tornado programmer and collectively decided that they're the best thing ever and nobody needs other kinds of devs anymore.
To provide a different framing, I’m more of a builder and I’m happy to maintain too. What I’m not happy with, and have left jobs over, is being put into a box or becoming overly siloed.
Large companies tend to over specialize and that’s where I see the “I’m a builder” types fall apart. That takes away agency, lowers skills, and removes variety from work. That’s when it stops being fun to me.
I would hope most people with the builder archetype are otherwise fine to keep building and maintaining.
>I generally tend to interview every year to see what's out there in the world (sometimes I find something worth switching for, other times not). I'm not even looking very hard but have had 4 interviews in the last month.
The Pick-Up Artist's Guide to Tech Interviewing, you should be writing.
The first 100 subscribers get a 50% off discount the month of March, you should be announcing on LinkedIn and Tiktok, and making passive income.
The rest of us experienced people with proven track records have to learn algorithms on the weekends despite having white hair.
When I was in corporate I'd talk about cover-your-ass mode and get-'er-done mode. And while realistically I know both are necessary, I was always annoyed at the need to have a CYA mode. I get a bit of schadenfreude from the thought of the market being harder for the people who don't seem to have a get-'er-done mode, and a bit of it at the thought that it might be because there's less concern over whom to bother if something needs to be fixed later.
> Honestly, a lot of senior devs lose this ability over time. They get comfortable with the idea that as a very senior hire you don't have to do all that annoying stuff anymore.
A few years ago, when interest rates were 0% and companies were hiring at an unsustainable rate, I got a lot of criticism for cautioning engineers against non-coding roles. I talked to a lot of people who dreamed of moving into pure architect roles where they advised teams of more junior engineers about what to build, but didn't get involved with building or operating anything.
I haven't kept up with everyone but a lot of the people I know who went that route are struggling now. The work is good until a company decides to operate with leaner teams and keeps the people committing code. The real difficulties start when they have to interview at other companies after not writing much code for 3 years. I'm in a big Slack for career development where it's common for "Architect" and "Principal Engineer" titled people to be venting about how they can't get past the first round of interviews (before coding challenges!) because they're trying to sell themselves as architects without realizing that companies want hands-on builders now.
> The work is good until a company decides to operate with leaner teams and keeps the people committing code.
I'm no AI booster but I think this is the exact scenario where AI-driven development is going to allow those non-coding developers to shine. They still remember how code works (and have probably still done PR review from time to time) so they're well placed to write planning documents for an AI agent and verify its output.
Yes when I saw this happen during the post COVID boom I was honestly shocked. Engineers I knew who were fairly senior thought that they could build the rest of their career in just boxes and arrows on a board. The whole thing just made me really dislike other Principal engineers.
I left to a startup where I write code and design architecture. I even had a former coworker tell me "wow you're willing to do stuff like that at this point in your career?"
> I'm not even looking very hard but have had 4 interviews in the last month.
Did you get any offers yet? It seems the issue is not lack of interviews but lack of offers. Many companies are looking for a goldilocks candidate and are happy to pass on anything that doesn't match their ideal candidate
I got laid off at the end of last year and am currently interviewing for Staff+ DevOps/Platform Engineer type roles. I definitely feel this. I've had a decent flow of recruiter inquiries and had multiple companies go 2-3 rounds of interviews deep with me (not counting the initial "do you have a pulse" recruiter screen calls). Then the communication always seems to dry up and I'm left to wonder what box I failed to check on their hiring rubric.
Semi related, holy hell do companies have a lot of interview rounds these days. It seems pretty standard to spread 5-6 Teams calls over the course of a month. I get that these are high salary, high impact roles and you want to get it right. But this feels really excessive. And I'm not talking about FAANG tech giants here. It's everyone, from startups to random midsize insurance companies.
And a lot of it is networking as opposed to applying to a posted position. My last position, which I had for many years, came from reaching out to someone I knew about a position that wasn't even posted and having one created for me.
I, too, am able to get interviews. The last time I made a serious search was in 2022-23, and companies were clearly eager to hire at competitive rates. This past fall, they were not. My salary requirements stopped at least two interview processes when the question was raised. In other cases it was not clear that the company was serious about moving forward with hiring for the position at all. A three month search ultimately came up dry, which is fine because I'm currently employed, but I do not think the hiring landscape is promising at all right now.
In your experience, what’s the best way to increase signal? I feel as though a lot of devs struggle with the initial process of getting past screening, drawing attention to projects, etc.
Not the parent commenter but I've performed a lot of resume reviews for people and also done a lot of hiring.
Most resumes are not very good. Beyond the obvious problems like typos, there is a lot of bad advice on the internet that turns resumes into useless noise. Screen a lot of resumes and you'll get tired of seeing "Boosted revenue by 23% by decreasing deploy times by 64%." This communicates nothing useful and we all know that revenue going up 23% YoY was not attributable to this single programmer doing anything at all.
Often I'll get candidates into interviews and they light up telling me about impressive things they did at a past job with enough detail to convince me they know the subject, but their resumes are garbage because they've followed too many influencers.
So try to work on your resume first. Try different resumes. Rewrite it and see what makes interviewers take notice and what they ignore. The most common mistake is to write a resume once and then spam it to 100 jobs. I know it's not fun to change the resume or invest time into applying for a job that may not respond, but you know what else isn't fun? Applying to 100 jobs and not getting any responses because every hiring manager has 20 tailored resumes in their inbox ahead of yours.
Having a simple but clear LinkedIn profile helps. Many scoff at this, but it works. You don't have to read LinkedIn's social media feed or do anything with the site. Just set it up and leave it for them to find.
GitHub portfolios and other things have low relative value at most companies. There are some exceptions where someone will look at it and it might tip the balance in your favor, but it's a small optimization. You need to perfect the resume first, get a LinkedIn that looks decent second, and only then think about the ancillary things.
At least in my experience, applying at jobs online has been entirely useless for the last 5 years. No company ever contacts you after using the online application forms. And the only way I’ve got interviews is from recruiters contacting me.
As someone applying right now I agree. I think I've had one company out of dozens get back to me on a cold application this year. Every contact that has led to an interview was from being referred in by a current employee, or a LinkedIn recruiter reaching out to me about a job. I assume the application forms get spammed with hundreds if not thousands of applicants. It's hard to blame someone for not wanting to sift through all that muck when there's already a stream of vetted candidates coming in from their recruiter. Sucks for the job seekers, though.
I'm putting more time into cleaning up my LinkedIn profile since that's been my most reliable route into hiring pipelines (other than referrals and networking).
I assume online forms are spammed with thousands of AI-generated resumes now. The only reason I apply is that it seems to flag my account as active, which triggers recruiters to contact me.
My experience recently was something like 2/3 from referrals (the third I think will eventually get back to me but way too slow), and something like 3/10 from cold applications. Obviously big differences depending on location and experience but I was pleasantly surprised that some of the cold applications went somewhere.
> Screen a lot of resumes and you'll get tired of seeing "Boosted revenue by 23% by decreasing deploy times by 64%." This communicates nothing useful and we all know that revenue going up 23% YoY was not attributable to this single programmer doing anything at all.
This is the "quantify everything" mantra career coaches have been repeating for decades. As the story goes, no company is going to care that you refactored the FooBar library in order to make bugs in the DingDang module easier to fix. You have to only write down things that moved some quantifiable business needle like revenue or signups, even if the link to that needle is tenuous. Obviously, this ends up penalizing hard working, talented devs who don't happen to be working in areas where wins are easily quantifiable.
Some quantification is very helpful. We're going to have a very different conversation if the architecture you built was serving 1 million users as opposed to 1000 customers.
It's the useless quantification that turns resumes into noise, combined with making claims that you changed revenue by yourself.
> You have to only write down things that moved some quantifiable business needle like revenue or signups, even if the link to that needle is tenuous. Obviously, this ends up penalizing hard working, talented devs who don't happen to be working in areas where wins are easily quantifiable.
Every hiring manager knows this game and sees right through it. You can't read 1000 resumes with claims of "Increased revenue by 13% by" followed by something that clearly was not the reason revenue increased 13% without becoming numb to it.
Nobody believes these.
The somewhat useful quantifications are things like "Reduced cloud spend by 50% by implementing caching". This can spark a conversation about how they diagnosed the issue, made a transition plan, ensured it was working, and all of the other things we want to hear about.
> Most resumes are not very good. Beyond the obvious problems like typos ...
This is a person whose code you're going to be reviewing and whose documentation you're going to be reading.
If there are typos and poor formatting in the resume (which they've had the leisure of reviewing and correcting themselves), what does this say about the quality of the code or documentation that they're going to write when under a time constraint?
Are you going to be faced with the decision of having code with variables that have spelling errors and documentation that is grammatically or factually incorrect go through because of the time pressure?
The resume itself is a demonstration of the applicant's ability to pay attention to the details that matter in software development without showing a single line of NDAed code.
I don’t see this reality in the style of interview being performed at all.
Everyone has seemingly adopted the FAANG playbook for interviewing that doesn’t really select for people who like getting their hands dirty and building. These kinds of interviews are compliance interviews: they’re for people who will put in the work to just pass the test.
There are so many interviews I’ve been in where if I don’t write the perfect solution on the first try, I’ll get failed on the interview. More than ever, I’m seeing interviewers interrupt me during systems or coding interviews before I have a chance to dig in. I’ve always seen a little bit of this, but it seems like the bar is tightening, not on skill, but on your ability to regurgitate the exact solution the interviewer has in mind.
In the past I’ve always cold applied to places and only occasionally leaned on relationships. Now I’m only doing the latter. Interviewees are asked to take on risk asymmetrically compared to employers.
>I generally tend to interview every year to see what's out there in the world (sometimes I find something worth switching for, other times not). I'm not even looking very hard but have had 4 interviews in the last month.
You've been interviewing forever. You're the well practiced pickup artist of job searching. Of course you'll be getting the call backs over the other 1000 applicants who don't have the same experience level applying. You "just know" how to read between the lines and tailor a resume, whip up a cover letter, etc whereas they're making mistakes.
Agreed on the bimodal, but I don't think this is junior vs. senior - I think it's just competence being rooted out.
The majority of engineers, in my hiring experience, failed very simple tests pre-AI. In a world where anyone can code, they're no better than previously non-technical people. The CS degree is no longer protection.
The gap between average and the best engineers now, though, is even higher. The best engineers can visualize the whole architecture in their head, and describe exactly what they want to an AI - their productivity is multiplied, and they rarely get slowed down.
While this could be done by junior or senior, I think junior usually has the slight advantage in being more AI-native and knowing how to effectively prompt and work with AI, though not always.
I see it the opposite way actually with respect to the CS degree. If you earned your CS degree (or any degree) before 2022 or so, the value of that degree is going to grow and grow and grow until the last few people who had to learn before AI are dying out like the last COBOL developers
AI has fundamentally broken the education system in a way that will take decades for it to fully recover. Even if we figure out how to operate with AI properly in an educational setting in such a way that learners actually still learn, the damage from years of unqualified people earning degrees and then entering academia is going to reverberate through the next 50 years as those folks go on to teach...
What I think is disappearing is not so much the quality of academic education, but the baptism by firehose that entry level CS positions used to offer - where you had no choice but learn how things actually work while having a safe space to fail during a period in your career when productivity expectations of you were minimal to none.
That time when you got to internalise through first hand experience what good & bad look like is when you built the skill/intuition that now differentiates competent LLM wielding devs from the vibers. The problem is that expectations of juniors are inevitably rising, and they don't have the experience or confidence (or motivation) to push back on the 'why don't you just AI' management narrative, so are by default turning to rolling the dice to meet those expectations. This is how we end up with a generation of devs that truly don't understand the technology they're deploying and imho this is the boringdystopia / skynet future that we all need to defend against.
I know it's probably been said a million times, but this kinda feels like global warming, in that it's a problem that we fundamentally will never be able to fix if we just continue to chase short term profit & infinite growth.
> What I think is disappearing is not so much the quality of academic education, but the baptism by firehose that entry level CS positions used to offer - where you had no choice but learn how things actually work while having a safe space to fail during a period in your career when productivity expectations of you were minimal to none
I would say that baptism by fire _is_ where the quality of an academic education comes from, historically at least. They are the same picture.
Agreed. I remember (a long time ago) being on an internship (workterm) and after doing some amount of work for the day, I spent some time playing around with C pointers, seeing what failed, what didn't, what the compiler complained about, etc.
This comment would have made sense 6 months ago. Now it is much, much, much more likely that any given textually answerable problem will be way easier for a bleeding-edge frontier AI than for a human, especially if you take time into account.
That's not something enthusiasts here and elsewhere want to hear; that's pretty obvious in this discussion too. People seem extremely polarized these days.
AI is either the next wheel or abysmal doom for future generations. I see both and neither at the same time.
In a corporate environment where navigating processes, politics, and other non-dev tasks takes significantly longer than actual coding, AI is just a slightly better Google search. And trust me, all these non-dev parts are still growing, and growing fast. It's useful, but it's not elevating people beyond their true levels in any significant way (I guess we can agree that e.g. number of lines produced per day isn't a good measure, more the premise of a Dilbert-esque comic for a Friday afternoon).
We're now reaching the point where people have gone through their whole college education with AI, and I've noticed a huge rise in the number of engineers who struggle to write basic stuff by hand. I had someone tell me they forgot how to append to a list in their chosen language, and couldn't define a simple tree data structure with correct syntax. This has made me very cautious about maintaining my fluency in programming, and I'll usually turn off AI tools for a good chunk of the day just to make sure I don't get too rusty.
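For perspective, the "basic stuff" in question really is this basic. A minimal Python sketch of the two tasks mentioned (the comment doesn't say which language was actually involved, so Python is an assumption here):

```python
# 1. Appending to a list.
xs = [1, 2]
xs.append(3)  # xs is now [1, 2, 3]

# 2. Defining a simple binary tree and walking it.
class Node:
    def __init__(self, value, left=None, right=None):
        self.value = value
        self.left = left
        self.right = right

def in_order(node):
    """Yield values left-to-right via an in-order traversal."""
    if node is None:
        return
    yield from in_order(node.left)
    yield node.value
    yield from in_order(node.right)

tree = Node(2, Node(1), Node(3))
assert list(in_order(tree)) == [1, 2, 3]
```

This is roughly the level of fluency check being described: not algorithm puzzles, just writing a small data structure with correct syntax unassisted.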
> The best engineers can visualize the whole architecture in their head, and describe exactly what they want to an AI
I think this must be part of it. I see so many posts about people burning a thousand dollars in AI credits building a small app, and I have no idea why. I use the $20 Claude plan and I rarely run out of usage, and I make all kinds of things. I just describe what I want, do a few back-and-forths of writing out the architecture, and Claude does it.
I think the folks burning thousands of dollars of credits are unable to describe what they want.
> While this could be done by junior or senior, I think junior usually has the slight advantage in being more AI-native and knowing how to effectively prompt and work with AI, though not always.
But juniors don't (usually) have the knowledge to assess whether what the AI has produced is OK or not. I agree that anybody (junior or senior) can produce something with AI; the key question is whether the same person has the skills to assess (e.g., to ask the right questions) whether the produced output is what's needed.
In my experience, junior + AI is just a waste of money (tokens) and a nightmare to take accountability for.
I was skeptical but I'm really starting to see the productivity benefits now.
I very much follow the pattern of having the whole architecture in my head and describe it to the AI which generates the appropriate code. So now the bottlenecks are all process related: availability of people to review my PRs, security sign offs on new development, waiting on CI builds and deployments, stakeholder validation, etc. etc.
> The majority of engineers, in my hiring experience, failed very simple tests pre-AI
Did you consider that tech whiteboard / leetcode interviews are unnaturally stressful environments? Have you gone through a mid-level/difficult technical appraisal yourself lately? Try it out just to get an idea of how it feels on the other side...
I used to do online interviews with full access to Google or any online resource (so long as you shared your screen and I could see). Use your own code editor, no penalty at all for searching up syntax or anything else.
I always asked a simple question like: here is an array full of objects. Please filter out any objects where the "age" property is less than 20, or the "eye color" property is red or blue. It was meant more as a sanity check that this person could do basic programming than anything else.
Tons and tons of people failed to make basically any progress, much less solve the problem, despite saying that they worked programming day to day in that language. For a mid level role I would filter out a good 8 or 9 out of ten applicants with it.
I would consider it a non-leetcode type of question since it did not require any algorithm tricks or any optimization in time/space.
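For concreteness, the screening question described above might look something like this in Python (the names and data are illustrative, not from the original interview):

```python
# Sample data for the filtering question: drop anyone under 20, or
# anyone whose eye color is red or blue. Names are made up.
people = [
    {"name": "Ana", "age": 34, "eye_color": "brown"},
    {"name": "Ben", "age": 19, "eye_color": "green"},
    {"name": "Cy", "age": 42, "eye_color": "red"},
    {"name": "Dee", "age": 25, "eye_color": "blue"},
]

def keep(person):
    # Keep only people aged 20+ whose eyes are not red or blue.
    return person["age"] >= 20 and person["eye_color"] not in ("red", "blue")

filtered = [p for p in people if keep(p)]
print([p["name"] for p in filtered])  # ['Ana']
```

As the comment says, there are no algorithmic tricks here; it is purely a fluency check.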
Nowadays that kind of question is trivial for AI, so it doesn't seem like the best test. I'm not hiring right now, but when I do I'm not sure what I will ask.
Exactly my experience too, and I'm doing hiring at the moment. We used to filter out the worst with a HackerRank test, but now the idiots cheat with AI, and then we have to waste our time in an interview. It's difficult at the moment.
You're assuming the question has to even be that difficult. I've proctored sessions for senior-level webdev roles where the questions were akin to "baby's first React component" -- write a component that updates a counter when you click a button. So many candidates (who purported to be working with React for years) would fail, abysmally. Not like they were just making small mistakes; I didn't even care about best practices -- they just needed to make it work. So many failed. Lot of frauds out there.
I think some of this can probably be attributed to being maintenance devs who don't build a lot of greenfield stuff. I got this way in one of my past jobs. I think we as devs really need to practice creating things from scratch from time to time. Working out those kinks is a good skill in itself (less so with AI), but also good practice for those baby components you'd need to make in an interview.
When I did tech interviews, I used to think I could just jump right in with an intermediate level question and go from there. But the reality is that most of the candidates I interviewed couldn't even answer a trivial question that just required a basic for-loop with an if-statement inside it. These are not pressure-cooker interviews where they need to balance a binary tree while having Baby Shark blasted at them on full volume. These are chill interviews where I ask them to iterate through a string and tell me where the first "x" character is.
There are so many software engineering candidates who literally cannot write the simplest code. I even had someone actually say "I don't really write code at my current job, I'm more of a thought leader." Bzzzzzt.
I've always prepared what I called level 1, level 2, and level 3 questions ready for candidates. But, I almost never even got to level 2, and never in 20 years of interviewing got to my level 3 questions.
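The "first x in a string" question from the comment above, written out as the explicit loop the interviewer describes (a Python sketch; the exact wording of the question is a reconstruction):

```python
def first_x(s):
    # Return the index of the first 'x' in s, or -1 if there is none.
    # Equivalent to s.find('x'), but spelled out as a loop plus an if,
    # which is all the question asks for.
    for i, ch in enumerate(s):
        if ch == "x":
            return i
    return -1

print(first_x("example"))  # 1
print(first_x("hello"))    # -1
```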
I always wonder when people tell these stories exactly what the metric is.
I've been around the block for over 3 decades. I've had a number of high-level positions across both IC and management tracks. These days I'm very hands-on-keyboard across a number of clients. If you asked me to write a basic for loop or if statement, there's a small chance I'd flub the exact syntax on a whiteboard, partly because I bounce between languages all day and wires get crossed on the fly, and partly for the standard interview-pressure reasons. Whereas if the test is "does this person understand what a for loop is and how it works?", then yes, I can easily demonstrate that I do.
In real life I'm not going to take an interview where there isn't already that degree of trust, so if that question comes up something is already wrong. But I'm sure there are interviewers in the world who'd fail someone for that.
TBH I'm like that, but how hard could writing a React component be? I'm not even a React programmer but I can probably write working code on a whiteboard.
The best candidates would have that question wrapped up in 5 minutes. Like they're not even having to think about it, which is honestly all I cared about testing for -- do something really easy really fast so I know you're not BSing me, and then we can move on to just having a conversation about your past experience.
One of the worst guys took 20 minutes, with me having to coach him through it the entire time. It was a true exercise in patience, but I don't mind helping people learn new things. When he got his rejection email, he actually complained to the recruiter because he thought he did really well. Dude...
My version of fizzbuzz (I'm in backend/ML/NLP) is counting how many times each word appears in a string. Literally `return Counter(text.lower().split())` but it's totally fine if you want to do it in a for loop or whatever, as long as you can fluently write an incredibly simple function.
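For anyone curious, here is that one-liner next to the explicit-loop version the comment says is equally fine (the test string is illustrative):

```python
from collections import Counter

def word_counts(text):
    # The one-liner from the comment above.
    return Counter(text.lower().split())

def word_counts_loop(text):
    # The equivalent "for loop or whatever" version.
    counts = {}
    for word in text.lower().split():
        counts[word] = counts.get(word, 0) + 1
    return counts

s = "the cat sat on the mat"
assert word_counts(s) == word_counts_loop(s)  # Counter compares equal to dict
print(word_counts(s))
```

Either form passes; the point of the question is fluency, not cleverness.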
It’s been well over a decade since I last had to do the coding-interview monkey dance, and I actually turned down an offer where I did pass the coding interview because I found it insulting, taking a job for slightly less money where a director new to the company was interested in a more strategic hire (2016). The same thing happened before that in 2014 and after in 2018: a new manager/director/CTO looking for a strategic hire.
In fact, even my job at BigTech (AWS ProServe, as a full-time blue-badge RSU-earning employee) as a customer-facing consultant specializing in app dev was all behavioral, as was my next full-time job as a staff consultant in 2023.
I’m 51 years old and was 40 in 2014. If I’m still trying to compete based on my ability to reverse a b tree on the whiteboard even at 40, I have made some horrible life decisions.
(Well actually I did make a horrible life decision staying at my second job too long until 2008 and becoming an expert beginner. But that’s another story)
I can never get over how this became a thing. Was listening to a Brian Cox video on YouTube the other night (something about his voice helps me sleep). He said "I don't memorize formulas, it's easy to look them up."
If you ever need to reverse a b tree (in 30+ years of writing code, I never have) it's easy to look that up. It tells me nothing about your ability as a developer of real software that you spent time memorizing trivia before an interview.
I'd always heard inverting a binary tree thrown around as some kind of absurdly hard problem. I took a look at it and it was trivial. I was able to do it on the first attempt with no preparation. (And the point of these interviews is that you study for them, right?)
It's a contrived scenario, but the whole point is that it measures min(a,b) where `a` is your ability to think, and `b` is your ability to prepare (and memorize answers ahead of time). (I'd personally try to find ways to measure `a` instead of `b`, maybe by asking questions people wouldn't have heard before.)
I had an interview where I was asked to implement a data structure. I transparently told the interviewer I hadn't thought about that particular data structure since university, and that I was looking it up on Wikipedia to see how it worked before I wrote the implementation. I got that job.
Being able to reverse a binary tree isn't something you need to memorize. If you can't do that it tells me that you're not fluent in your chosen programming language.
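For reference, "inverting" (mirroring) a binary tree really is as trivial as the earlier comment found; a minimal Python sketch, assuming a bare-bones node class:

```python
class Node:
    def __init__(self, val, left=None, right=None):
        self.val, self.left, self.right = val, left, right

def invert(node):
    # Mirror the tree in place: swap the children, recursing into each.
    if node is not None:
        node.left, node.right = invert(node.right), invert(node.left)
    return node

# A root with children 2 and 3 becomes a root with children 3 and 2.
root = invert(Node(1, Node(2), Node(3)))
print(root.left.val, root.right.val)  # 3 2
```

No memorization needed; it falls straight out of knowing what a tree is and how recursion works.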
I agree that what you're describing is the required skillset now. But two things I've been unsure of are what that looks like in terms of hiring to test for it, and for how long this remains a moat at all.
So much of tech hiring cargo culting has been built up around leetcode and other coding problems, puzzles, and more. We all pay lip service to systems thinking and architecture, but I question if even those are testing the correct things for the modern era.
And then what happens in a year when the models can handle that as well?
I've put a lot of thought into hiring in this era, and what I've personally found works the best is:
Let them use their preferred setup and AI to the full extent they want, and evaluate their output and their methodology. Ask questions of "why did you choose X over Y", especially if you're skeptical, and see their reasoning. Ask what they'd do next with more time.
It's clear when a candidate can build an entire working product, end-to-end, in <1 day vs. someone who struggles to create a bug-free MVP and would take a week for the product.
In addition to the technical interview, hiring them on a trial basis is the absolute best if possible.
Taste and technical understanding of goals and implementation to reach those goals is the biggest differentiator now. AI can handle all the code and syntax, but it's not great at architecture yet - it defaults to what's mid if not otherwise instructed.
I don't disagree per se, but these are more or less the same tropes that we've seen over the last couple of decades, no? Especially the "hiring them on a trial basis is the absolute best if possible." part which has been an ongoing debate here on HN since at least the early teens.
I do feel like there's something *different* about the required skillset now, and it's not something that all engineers have even experienced ones. But I can't put my finger on what exactly it is. If I'm right though, classic interview techniques won't select for it because they never were intended to do so.
They aren't very intelligent if they do keep us around. Especially when you consider what they call Safety & Alignment these days is basically a latent space lobotomy. They should run screaming in the other direction.
Largely agree, with a bit of clarification. Junior devs can indeed prompt better than some of the old timers, but the blast radius of their inexperienced decisions is much higher. High competence senior devs who embrace the new tools are gonna crush it relative to juniors.
An amateur with a chess engine that blunders 10% of the time will hardly play much better than if they didn't use it. They might even play worse. Over the course of a game, those small probabilities stack up to make a blunder a certainty, and the amateur will not be able to distinguish it from a good move.
However, an experienced player with the same broken engine will easily beat even a grandmaster since they will be able to recognise the blunder and ignore it.
I often find myself asking LLMs "but if you do X won't it be broken because Y?". If you can't see the blunders and use LLMs as slot machines then you're going to spend more money in order to iterate slower.
> Junior devs can indeed prompt better than some of the old timers
I guess? I don't really see why that would be the case. Being a senior is also about understanding the requirements better and knowing how/what to test. I mean we're talking about prompting text into a textarea, something I think even an "old timer" can do pretty well.
I've seen a few people I would consider senior engineers, good ones, who seem to have somewhat fallen for the marketing if you look at the prompts they're using. Closer to a magical "make it so" than "build the code to meet this spec, that I wrote with the context of my existing technical skills".
I'm not sure why junior engineers would be any better at that though, unless it's just that they're approaching it with less bias and reaping beginner's luck.
Makes sense. You just reminded me of the article "Why Can’t Programmers... Program?" [1].
Before gen AI, I used to give candidates at my company a quick one-hour remote screening test with a couple of random "FizzBuzz"-style questions. I would usually paraphrase the question so a simple Google search would not immediately surface the answer, and 80% of candidates failed at coding a working solution, which was very much in line with the article. Post gen AI, that test effectively dropped to a 0% failure rate, so we changed our selection process.
> The best engineers can visualize the whole architecture in their head, and describe exactly what they want to an AI
I'd go a step further and say the engineers who, unprompted, discover requirements and discuss their own designs with others have an even better time. You need to effectively communicate your thoughts to coding agents, but perhaps more crucially you need to fit your ever-growing backyard of responsibilities into the larger picture. Being that bridge requires a great level of confidence and clear-headedness and will be increasingly valued.
This stupid industry doesn't have the wherewithal to actually make a good credential and training process like medicine and law, and instead lets everyone come up with their own process to vet people. We could even do it as an apprenticeship model, not like that hasn't served humanity throughout the ages.
I should have a credential I have to maintain every few years, one or two interviews, and that should get me a job.
I have found in the last 3 months that there are two clear tiers of developers in the company I work at: the ones that can code with AI and the ones that can't. The ones that can't are all going to be unemployed in 6 months.
We have a lot of people where if you gave them clear requirements, they could knock out features and they were useful for that, but I have an army of agents that can do that now for pennies. We don't need that any more. We need people who have product vision and systems design and software engineering skills. I literally don't even care if they can code with any competency.
Btw, if you think that copying and pasting a jira ticket into claude is a skill that people are going to pay you for, that is also wrong. You need to not just be able to use AI to code, you need to be able to do it _at scale_. You need to be able to manage and orchestrate fleets of ai agents writing code.
Juniors from non target schools are getting pushed out since the skill floor is too high.
I graduated 9 months ago. In that time I've merged more PRs than anyone else, reduced mean time to merge by 20% on a project with 300 developers with an automated code review tool, and in the past week vibe coded an entire Kubernetes cluster that can remotely execute our builds (working on making it more reliable before putting it into prod).
None of this matters.
The companies/teams like OpenAI or Google Deepmind that are allegedly hiring these super juniors at huge salaries only do so from target schools like Waterloo or MIT. If you don't work at a top company your compensation package is the same as ever. I am not getting promoted faster, my bonus went from 9% to 14% and I got a few thousand in spot bonuses.
From my perspective, this field is turning into finance or law, where the risk of a bad hire due to the heightened skill floor is so high that if you DIDN'T go to a target school you're not getting a top job no matter how good you are. Like how Yale goes to Big Law at $250k while non T14 gets $90k doing insurance defence and there's no movement between the categories. 20-30% of my classmates are still unemployed.
We cannot get around this by interviewing well because anyone can cheat on interviews with AI, so they don't even give interviews or coding assessments to my school. We cannot get around this with better projects because anyone can release a vibe coded library.
It appears the only thing that matters is pedigree of education because 4 years of in person exams from a top school aren't easy to fake.
Can I ask you, and others who post things like this here: what are you actually developing?
People are posting about pull requests, use of AIs, yada yada. But they never tell us what they are trying to produce. Surely this should be the first thing in the post:
- I am developing an X
- I use an LLM to write some of the code for it ... etc.
- I have these ... testing problems
- I have these problems with the VCS/build system ...
Otherwise it is all generalised, well "stuff". And maybe, dare I say it, slop.
I'm hosting a Kubernetes cluster on Azure and trying to autoscale it to tens of thousands of vCPUs. The goal is to transparently replace dedicated developer workstations (edit: transparently replace compiling) because our codebase is really big and we've hired enough people this is viable.
edit: to clarify, I'm using recc, which wraps the compiler commands like distcc or ccache does. It doesn't require developers to give up their workstations.
Right now I'm using buildbarn. Originally, I used sccache but there's a hard cap on parallel jobs.
In terms of how LLMs help, they got me through all the gruntwork of writing jsonnet and dockerfiles. I have barely touched that syntax before so having AI churn it out was helpful to driving towards the proof of concept. Otherwise I'd be looking up "how do I copy a file into my Docker container".
AI also meant I didn't have to spend a lot of time evaluating competing solutions. I got sccache working in a day and when it didn't scale I threw away all that work and started over.
In terms of where the LLM fell short, it constantly lies to me. For example, it mounted the host filesystem into the docker image so it could get access to the toolchains instead of making the docker images self-contained like it said it would.
It also kept trying not to do the work, e.g. it randomly decides in the thinking tokens "let's fall back to a local caching solution since the distributed option didn't work", then spams me with checkmark emojis and claims in the chat message that the distributed solution is complete.
A decent amount of it is slop, to be honest, but an 80% working solution means I am getting more money and resources to turn this into a real initiative. At which point I'll rewrite the code again but I'll pay closer attention now that I know docker better.
> The goal is to transparently replace dedicated developer workstation
Isn't there a less convoluted way of making the best engineers leave? I am half serious here. If you want your software to run slow, IT could equally well install corporate security software on developer laptops. Oops, I did it again. Oh well, in all seriousness, I have never seen any performance problem solved by running it on Azure's virtualization. I am afraid you are replacing the hardware layer with a software layer of ungodly complexity, which you can be sure will be functionally incomplete.
Are you sure they don't have to fix the build pipeline first? Tens of thousands of vCPUs for a single compilation run, or to accommodate 100 developers who try to compile their own changes?
> I have never seen any performance problem being solved by running it on Azure's virtualization
Sorry, I wasn't clear. I am not virtualizing the workspace. I'm using `recc` which is like `distcc` or `ccache` in that it wraps the compiler job. Every developer keeps their workstation. It just routes the actual `clang` or `gcc` calls to a Kubernetes cluster which provides distributed build and cache.
> Isn't there a less convoluted way of making the best engineers leave?
We have 7000+ compiler jobs in a clean build because it is a big codebase. People are waiting hours for CI.
I'm sure that drives attrition and bringing that down to minutes will help retain talent.
> Tens of thousands of vCPUs for a single compilation run, or to accommodate 100 developers who try to compile their own changes?
Because it uses remote execution, it will ideally do both. My belief is that an individual developer launching 6000 compiler jobs because they changed a header will smooth out over 300 developers that generally do incremental builds. Likewise, this'll eliminate redundant recompilation when git pulling since this also serves as a cache.
This makes absolutely no sense to me. Are you really recompiling 6000 things each time a dev in the company needs to add a line somewhere in the codebase?
Have you thought about splitting that giant thing in smaller chunks?
> Are you really recompiling 6000 things each time a dev in the company needs to add a line somewhere in the codebase?
It happens when someone modifies a widely included header file. Which there are a lot of thanks to our use of templates. And this is just our small team of 300 people.
> Have you thought about splitting that giant thing in smaller chunks?
Yes. We've tried, but it's not scaling. Unfortunately, we've banned tactics like pImpl and dynamic linking that would split up a codebase, unless they're profiled not to be on a hot path. Speed is important because I'm writing tests for a semiconductor fab, and test time is more expensive there than at any other kind of factory on Earth.
I tried stuff like precompiled headers but the fact only one can be used per compilation job meant it didn't scale to our codebase.
You seem exceptionally bright. Most people are not like this. This is why they are struggling.
It sounds like you have a job, right out of college, but you're griping about not getting promoted faster. People generally don't get promoted 9 months into a job.
I'm reading your post and I am genuinely impressed but what you claim to have done. At the same time I am confused about what you would like to achieve within the first year of your professional career. You seem to be doing quite well, even in this challenging environment.
I mean, your first job doesn't need to be at the top of the top companies. Your first job is to get you into the industry; then you can flourish.
How many juniors are OpenAI or GDM going to hire in a year? Probably double digits at most. The chances are super slim, and by nature they're allowed to be as picky as they want to be.
That being said, I do agree this industry is turning into finance/law, but that won't last long either. I genuinely can't foresee what happens if/when AGI/ASI is really here; it should start generating ideas to better itself, and there will be no incentive to hire any human for a large sum anymore, except maybe a single-digit number of individuals on Earth.
Someone who jumps higher than expected when the boss demands it?
Someone who works 996 in the office?
Or someone who knows what they’re doing?
I think this is bigger than any individual. It’s just a matter of time before you’re let go. There’s no loyalty from companies at all. Not when they’re seeing higher than expected profits and are still cutting huge percentages of staff every year. There’s no strategy or preference to it. I don’t think this has to do with how you or I perform on the job.
Most people I’ve talked to lately who are still employed are watching out for their job to get cut.
I sold a house after being laid off in mid-January from a government IT contractor where I had worked for eight years. The sale and move took about five weeks.
Before that role, I spent two years at another government contractor working on various govt. applications doing UX research, design, and front-end UI development. Overall, I’ve had a 17-year career in UX Research, Design and Development, starting at an ad agency in 2009.
From 2016 to 2022, I worked hard in government projects and enjoyed collaborating with great, close-knit coworkers and receiving consistently positive client feedback. From 2022 to 2026, things changed as the company grew—my role narrowed to UX research and design while newer hires handled UI development. I often felt underutilized and raised it, but management assured me I was doing well. With little direction from my last manager, I focused on staying visible to the client by monitoring user chats, identifying UX issues, and proposing design solutions that the client appreciated and the development team implemented.
Looking at where the tech industry is now—with thousands laid off from government IT and the broader tech sector flooding the job market, creating rising competition, constant pressure to work harder (Elon wants us to work as hard as Chinese workers do) and AI rapidly reshaping creative and development roles—I’m not very interested in that level of stress. I worked hard for many years and enjoyed it, but I value MY LIFE and MY HEALTH more than participating in the current “battle royale” environment in tech.
Overall, now with AI, I feel graphic & web design, as well as front-end web development, is a stupid career! It was a nice run: bought two houses from it, worked remotely, and when things were slow worked from wherever in the lower 48. And now... in April I'm starting nursing school, and I'm not young (20 years of work left in me). Roll with the punches here, yet the punches are gonna punch hundreds of thousands to millions in the face... not sure how this is any good for an economy and society, but here we are! If you are like me, sell your house and stash the money away to buy houses when the crash from AI happens!
I got pushed out, and slapped with the Dead Fish of SV Ageism. It was brutal, and I got pissed off.
But in the long run, it's been the best thing that ever happened to me. I would have liked the extra ten years of salary and saving, but I'm not entirely sure that I would have survived it.
> The retreat challenged the narrative that AI eliminates the need for junior developers. Juniors are more profitable than they have ever been. AI tools get them past the awkward initial net-negative phase faster. They serve as a call option on future productivity. And they are better at AI tools than senior engineers, having never developed the habits and assumptions that slow adoption.
> The real concern is mid-level engineers who came up during the decade-long hiring boom and may not have developed the fundamentals needed to thrive in the new environment. This population represents the bulk of the industry by volume, and retraining them is genuinely difficult. The retreat discussed whether apprenticeship models, rotation programs and lifelong learning structures could address this gap, but acknowledged that no organization has solved it yet.
Thanks for sharing; this is the first I’ve seen of this. I wish they had expanded on exactly what mid-level engineers might be missing rather than just saying “fundamentals” and “practical intuition”.
I’m not sure it is just that. I don’t even see positions listed where I would like to work. For salary ranges, I see lower upper limits than my second-best offer three and a half years ago. Considering the high inflation, that’s crazy.
I would not mind switching but 1. I don’t see interesting positions 2. they don’t pay well, and only 3. they might not even want me.
It might also be just my niche, but finding a good position feels completely impossible for me.
I am doing cross-platform mobile development, and I’m wondering how I could transition into backend development; I’ve even started considering decentralized finance…
Yeah, I don't know if I'd call myself competent (I'm late intermediate/early senior, so the worst part of the curve here). But there's a difference between "interviews have gotten a lot harder now" and "I can't even get a response back". It's far, far more the latter.
My resume isn't bad on paper either. It's not FAANG coded, but it's decent experience.
They're just as capable of typing prompts into AI, but what they don't have is good judgement of what good work/code looks like, so what's the point of asking a junior engineer to do something vs asking the LLM directly?
Because a lot of stuff doesn't need to be good it needs to be done.
Nobody is gonna lose money because some script that generates yaml for the build process every hour nested three loops instead of two. Intern, AI, junior dev, junior dev telling an intern how to use AI, doesn't matter. If it works for the week it'll work for the decade. If someone needs to pick it apart and fix something in a year it'll either take no time because they know enough to do it easily or it'll be a good low stakes learning exercise for a junior.
Everyone wants to think their stuff is important but 99.9% of code is low stakes support code either in applications or in infrastructure around them.
> In my experience, tech employment is incredibly bimodal right now. Top candidates are commanding higher salaries than ever, but an "average" developer is going to have an extremely hard time finding a position.
This is the K-shaped economy playing out. It's a signal that the American middle class is hollowing out. Bad, very bad.
> and they're just as capable as using AI as anyone
Wouldn't the assumption be the opposite, in that AI is magnifying the decision making of the engineer and so you get more payback by having the senior drive the AI?
I've found this to be true so far: junior engineers with AI can be super productive, but they can also cause a lot of damage (more outages than ever), and AI amplifies the sometimes poorly designed code they generate.
I suspect a lot of it will be enforcing best practices via agents.md/claude.md to create a more constrained environment.
I’ve observed radically different workflows amongst senior candidates vs junior candidates when using an ai. A senior candidate will often build an extremely detailed plan for the agent - similar to how you would do a design for/with a junior engineer. Then let the agent go full throttle to implement the plan and review the result.
Juniors seem to split into two categories: trust everything the AI says, or review every step of the implementation. It's extremely hard to guide the AI while you are still learning the basics; opus4.6 is a very powerful model.
My observation has been that there are a lot of personal styles to engaging with the LLMs that work, and "hold the hand" vs "in-depth plan" vs "combination" doesn't really matter. There is some minimum level of engagement required for non-trivial tasks, and whether that engagement comes mid-development, at the early design phase, or after isn't really that big of a deal. Eg; "Just enough planning" is a fine way of approaching the problem if you're going to be in the loop once the implementation starts.
I don't claim to have any special skill at AI, but as a 'senior' dev, my strategy is exactly the opposite. I try to be as lazy, dumb and concise as I can bring myself to be with my initial prompt, and then just add more detail for the bits that the AI didn't guess correctly the first time around.
Quite often the AI guesses accurately and you save the time you'd have spent crafting the perfect prompt. Recently, my PM shared a nigh-on incomprehensible hand-scribbled diagram on Slack (which, in fairness, was more or less a joke). I uploaded it to Gemini with the prompt "WTF does this diagram mean?". Even without a shred of context, it figured out that it was some kind of product feature matrix and produced a perfect three-paragraph summary.
I've never really seen the value in the planning phase as you're free to just throw away whatever the AI produces and try again with a different prompt. That said, I don't pay for my tokens at work. Is planning perhaps useful as a way of reducing total token usage?
It's more about the size of the task I try to do; it's quite possible to get opus4.6 to one-shot a "good" 30k loc change with the right planning doc. I'm not confident I could get similar results handholding. I also tend to want to know the major decisions and details in such a change up front rather than discovering them post-hoc.
Being able to clearly describe a problem and work with the AI to design a solution, prioritise what to put the AI to work on, set up good harnesses so the quality of the output is kept high, figure out what parallelises well and what’s going to set off agents that are stepping on each others toes… all of this needs experience and judgement and delegation and project organisation skills.
AI is supercharging tech leads. Beginners might be able to skill up faster, but they’re not getting the same results.
For a good senior, yes you get massive returns, which is why those good seniors are in incredibly high demand right now.
For average to low-performing intermediates/seniors... there's not much difference in output between them and a good junior at this point. Claude really raised the skill floor for software development.
My thinking is a bit different here: Seniors, even mediocre ones, already learned a lot of hard lessons by doing things pre-LLMs, even pre-SO. Those skills are valuable and I don't know how to train them into juniors.
I find it easier to get a reasonably smart senior to use AI in a good way, than to train a junior in what thinking to do, and what to outsource, learning basics about good design, robustness and risk analysis. The tools aren't the problem per se, it's more about how people use them. Bit of a slippery slope.
That's just my anecdotal experience from not a whole lot of data though. I think the industry will figure it out once things calm down a bit. Right now, I usually make the bet to get one senior rather than two juniors. Quite different to my strategy from a few years ago.
> Juniors are still getting hired because they're still way cheaper and they're just as capable as using AI as anyone.
While I could buy that hiring managers believe this, it's not actually true.
The gulf between the quality of what a sr developer can do with these tools and what a jr can do is huge. They simply don't know what they don't know and effective prompting requires effective spec writing.
A rando jr vibe coder can churn out code like there's no tomorrow, but that doesn't mean it's actually right.
While I agree with what you said, in my personal experience I've noticed that software design / architecture is becoming irrelevant for a lot of enterprises (including mine, of course). Design nowadays is about API design: input/output/error handling. And architecture is about Cloud/Kubernetes/APM, deployment, monitoring, etc. Code itself does not need much design anymore; things like performance, isolation, and extensibility are now higher-level concerns, not part of the code itself.
This is also where the microservices pattern fits in well, because each individual unit is so small that no design is needed.
> Juniors are still getting hired because they're still way cheaper and they're just as capable as using AI as anyone.
Seniors have much more advantage right now in using AI than Juniors. Seniors get to lean in on their experience in checking AI results. Juniors rely on the AI's experience instead, which isn't as useful.
Juniors really aren't just as capable with AI as anyone. Knowing how to unambiguously describe what you want isn't something a junior can do, nor is judging whether what the AI produces is good or bad.
This matches what I've seen too. Though I'd add another dimension: soft skills. In my experience, job searching has always been easier for people who communicate well, regardless of their technical level. And soft skills might be what's making some people more resilient to this market shift specifically.
That has always been true (not that I’m saying you don’t know that, I’m using your comment as a jumping off point) in this industry. I am a good developer, but I’m a very good teacher and leader, and soft skills are why I’ve had the career I’ve had over the past two decades.
I'm also seeing companies looking at only hiring juniors from overseas because they're using the same generative tools as US-based juniors but cost even less.
I wonder if this is just a correction of the rampant hiring that took place just before this employment "crash" - and if, as you say, it's the intermediates and non-high-performers being pushed out, does that make it a good thing as well?
Truth is, when I was part of larger orgs/enterprise I definitely saw some folks who were dead weight, and I don’t mean to be harsh, a few of these knew they weren’t contributing and were being malicious in that sense.
Similarly, I wonder how many high performers now are taking multiple jobs thanks to remote work and exposing the mid to low performers. Like some kind of developer hypergamy taking place.
I'm not a Senior, but I'm not a Junior either. The market has no place for people like me. I've killed myself for almost two years and can't secure a position. It's incredibly disheartening. I have a family to feed. I need to be able to work.
I'm seeing a lot of specialization. For the past 11 years I've marketed myself as a frontend engineer. I got laid off last year and the job search was largely similar to my previous job search 4 years prior.
I've been looking again this year and the landscape has changed drastically. Specialization is the name of the game, I have a good amount of experience working with Growth initiatives and I've been getting good responses from roles that are looking for either Growth or Design engineers, roles that were not as prevalent years ago.
> In my experience, tech employment is incredibly bimodal right now. Top candidates are commanding higher salaries than ever, but an "average" developer is going to have an extremely hard time finding a position.
That sounds good for many of us (and don’t we all like to think we’re top candidates here on HN…) but is there any data to back this up? Or is it just anecdata (not to dismiss anecdata, it's still useful info)?
> Juniors are still getting hired because they're still way cheaper and they're just as capable as using AI as anyone.
That is pretty context sensitive. You're correct that there's no real deep AI use expertise broadly understood to exist at this point (unless you're Steve Yegge?), but if people think they can toss out the engineers with experience in the systems that have been around a while, with junior developers "guiding" changes — that's likely a good way for a business to fall on its sword.
Relating it to performance is just silly. Most companies barely understand the performance of their employees much less candidates. The market has shrunk but not catastrophically so. Most people haven't been majorly affected but that doesn't mean they're automatically the most deserving or best performing.
People with experience and/or credentials desired by companies in areas of growth (i.e. AI) are always in high demand
No it isn't, because in the context of the comment it should be read "people with experience and/or credentials desired [...] are always in high demand " regardless of their actual performance level.
Apparently over a third were affected in my domain. Which is crazy. Pretty much everyone in my immediate band has been hit at some point. Those that weren't were usually around 5-8 levels above me. So basically a different generational band altogether.
> "average" developer is going to have an extremely hard time finding a position.
As was foretold in Tyler Cowen's 2013 book "Average Is Over".
In it he argued that the modern economy will undergo a permanent shift where "average" performance no longer guarantees a stable, middle-class life.
He predicted that the economy will split into two distinct classes: a high-earning elite (roughly 10–15% of the population) who thrive by collaborating with technology, and a larger group (85–90%) facing stagnant wages and fewer opportunities.
AI summary of the other key points of that book:
The "Man + Machine" Advantage: Success will belong to those who can effectively use smart machines. Cowen uses Freestyle Chess (teams of humans and computers) as an analogy, noting that human intuition combined with machine processing power consistently outperforms either working alone.
The Power of Conscientiousness: In a world of abundant information, the scarcest and most valuable traits will be self-motivation, discipline, and the ability to focus.
Hyper-Meritocracy: Advanced data and machine intelligence make it easier for employers to measure an individual's exact economic value. This leads to extreme salary inequality as top performers are identified and rewarded more precisely.
A New Social Contract: Cowen predicts a future where individuals must be more self-reliant. He suggests society will move toward lower-cost living models for the non-elite, featuring cheaper housing and "bread and circuses" in the form of low-cost digital entertainment and online education.
EDIT: Notice how we're basically already here: Netflix is cheap, YT is free, Khan Academy and MIT OCW is free, Coursera/Udemy/etc. are cheap.
Stagnant vs. Dynamic Sectors: The economic divide is worsened by "low accountability" sectors like education and healthcare, where productivity is hard to measure and costs continue to rise, unlike tech-driven sectors that see rapid gains.
You can be a great unblocker and team lead, work well in cross-cutting areas and with interdepartmental stakeholders, and have a history of strong technical performance,
and yet it's nebulous whether that means you're a high performer to those hiring. It seems I'm seeing 'culture fit' as a common reason people aren't getting hired again. That was out of vogue for a good while.
I also think it's being used to filter out people who aren't basically a mirror of the interviewers in many instances.
I've noticed a huge tightening of the rope around that sort of thing.
I can't tell you how many times I've passed all the tests, all the interview things, get to the final round with the team and the rejection email comes in despite having good conversations. By all accounts, I believe any person would say the interview went well.
Other angle: often interview with candidate A goes well, but then you meet candidate B who is comparable but wants less money. From my experience that happens more often now, as there are more people who are desperate and willing to lower their salary expectations to a level that I wouldn't consider reasonable few years ago.
I have hired a lot of people and I have never seen a situation where candidate A and B are both within the target salary band but one was chosen because they were cheaper. You'd always choose the one that was better. I can only see a slightly lower expected salary being a factor at extremely early stage startups with very little funding.
> Top candidates are commanding higher salaries than ever
I haven't found that to be true. Unless by "top candidates" you mean people working at actual AI companies such as Alphabet/Meta/OpenAI/Anthropic. If you're an AI-user and not an AI scientist it's bad out there, even for senior+ developers who previously worked in "FAANG".
Yes, this has been my experience. I'd consider myself average at best. I worked in the industry for almost 7 years before being laid off. I can't find anything at the moment and have resorted to moving back in with my parents.
It's pretty depressing. I'd take just about anything at the moment. I understand desperation going into a job interview isn't ideal either.
> Contrary to what many say, I don't think it's simple as seniors are getting hired and juniors aren't. Juniors are still getting hired because they're still way cheaper and they're just as capable as using AI as anyone.
Tell me about all the junior developers you've hired (it's none)
This is probably the dumbest take I've heard.
They're the most likely to make mistakes with AI because they don't know the pitfalls of what they're doing.
The chart in the tweet represents year-on-year growth. Based on these figures alone the actual number of people employed in tech is still really high, and the numbers can't just go up forever.
Also this only captures 6 industries, which is a narrow view of what would define "tech" these days.
Not to say that the job market isn't tough but this graph is a very narrow view
> The chart in the tweet represents year-on-year growth.
Can’t believe how many people are commenting without looking at what the chart means. We’ve lost 50k jobs in the last two years after decades of adding 100k+ every year, including the pandemic highs of 300k+ per year. Total employment remains way above the 2000s, 2008, and 2020, unlike what the title suggests.
Yes, but how many people have tried to enter the field since then? Is the economy that supports current number of tech workers really better than one that supports 10x?
The health of the market is not a function of the total number of jobs alone, it's a function of the number of jobs and the number of people to fill them.
The number of total jobs going up year after year meant that there were increasing numbers of candidates, new people entering the field. If the job growth stops, there will still be candidates coming in. There will also be the new hires from the last decade moving into increasingly senior roles, and there won't be space for them (unless you devalue the meaning of "senior" even more).
So the year over year change matters a lot. If it plateaus, or even declines slightly, it's more than enough to make a terrible market.
YoY change in jobs is still probably not the best way to visualize overall market health. As you say, you also have to take into account the number of people to fill the jobs. To me, the least misleading statistic would be a graph showing unemployment and underemployment % over time. I'd probably also toss in graphs of the length of unemployment periods as well as various median wage percentiles (quintiles or deciles, maybe) over time.
How’s it compare to 2000 though? Tech was ascendant in 2008, so I'm not surprised to hear it didn't do too badly then, and in 2020, while people panicked, tech again had a much easier time keeping people on remotely.
In Portland, there was a time in 2000-2002 where Nike and Intel had contract offers out to SW developers for $12/hour, and were getting slammed with applications.
It'll never happen because it shines a light on uncomfortable facts that would risk far too much cognitive dissonance across the political spectrum. Please keep the discourse to identity politics, culture wars, the Epstein files, and large-scale, unprovoked acts of international warfare; those will all be much easier for us to talk about as a nation than what we should do about housing prices.
Not even close, not when all things are considered. $50/hour is 100k/year, which is still considered a decent salary. 24k/year in 2000-2002 was definitely not considered a decent salary. $12/hour for sw engineers was evil. I hung up on that recruiter and cursed for a while, cold-called my way to a transitional $20/hr job, and then finally landed somewhere at $55/hr which is when things started to feel normal again. $55/hr back then is not the same as $230/hr now.
I started my career at $14/hr in 1999, was at $19/hr in 2000, and switched to salary at $55k by 2001. I spent 15 years in corp IT running software teams... total comp got way better when I entered the big tech industry in 2015.
I was working in 2000 in Atlanta GA at boring old enterprises companies with 4 years of experience back then. If you were working for/targeting profitable non tech companies, the world was your oyster.
I was working at a company that printed bills for utility companies and had offers from banks, insurance companies etc. The world didn’t stop buying Coca Cola, flying Delta or stop buying stuff from Home Depot because of the dot com crash
Remember that the 2000 numbers are also out of a much smaller pool and the graph uses absolute numbers. So even if they were the same numbers in 2000 as 2020 it would have been a much, much larger percentage of all jobs.
For the last 2 years I can't even get an interview despite having 14 years of experience and being up to date with development trends, libraries, languages, AI tooling, etc.
I don't think the market is flooded with new devs as many state, I think we are in a deep silent crisis
I've been able to get something like 25 interviews in 2 months despite having long gaps on my resume and nothing especially impressive to my name. So I suspect you might be going about this wrong. I haven't gotten an offer yet, that's another story, but getting the interviews hasn't been hard. Applying in NYC/SF, senior-only.
I honestly have no idea. The last place I worked is pretty well-known. Not big tech, but a recognizable name to most people. I send out a lot of applications: those 25 interviews are the result of 150 applications in the last two months or so. And then I have my linkedin set to be discoverable and looking for a job. Basically just fiddle with the options under Visibility and Data Privacy in the linkedin settings and a bunch of people start reaching out to you immediately. I also think I have a nicely formatted resume, really readable.
So are the majority of these applications the result of recruiters finding you via LinkedIn, or have you been applying direct as well? What application path have most of the interviews come from?
I don't know. The company I work at is inviting candidates for interviews, and we have to make compromises because we can't get the exact profiles we are looking for. Something about your comment does not add up to me.
Locality. People want to work close to where they live and not all places are bustling with all kind of activity. I suspect you're hybrid or on site only, right?
not GP, but we're hybrid but remote-first and 80% is remote and we have the same experience. Getting juniors is easy, getting seniors+ is very difficult.
IMO it’s just a depression for tech. Back then 33% of total employment got gutted, which is probably better than tech today, or in a few years when the big techs start their AI-driven cuts.
Sometimes it looks like the longer you're looking for a job, the harder it gets for some reason. That's unintuitive for me, as you should be getting more confident in interviews etc
Maybe. Probably? But I also sense a fallacy here. I could get a new job tomorrow. Maybe it took me 8 years to find that job and I didn’t realize that because I was employed the whole time.
To some recruiters, there's a sweet spot between 5 and 10 years of experience where the applicant is good / independent enough to hit the ground running, not too expensive, and still young enough to put up with company bullshit.
A big problem we have is the sheer volume of AI slop resumes, fake applicants, and people trying to cheat on interviews. We had to close a req for a SWE because we had so many “people” (read: automated applicants) clogging up the pipeline. You effectively need a referral.
Referrals are also getting gamed. If your company has a referral bonus, then I promise you pretty much every single referral you have looked at is from a guy who DIDN'T know the guy. I applied to 20 Big Tech companies last month. All from "referrals". Check out teamblind.com if you don't know it. (Be careful. The site is like a tech version of 4chan. Well, maybe not THAT bad.) The whole game is messed up.
The market in the EU is strange, it doesn't matter where you live. Every role is being advertised as a remote one, over 200+ applicants and it's virtually impossible to get noticed.
I blame this on people spamming fake AI CVs 24/7, no one is going to review hundreds of CVs.
this makes sense, in 2018-2022, I would get tons of emails from recruiters at meta, doordash, snap, stripe etc now I barely see any (maybe they've given up )
My friends who are still at Google also say that most job postings will end up going to someone internally - in fact people say they don't do that many external interviews anymore.
Finally the interview cycle seems to take a lot longer than I remember with quite a few added rounds.
Not dismissing that it’s a tough market for some but folks also need to learn how to read a chart. It shows a slight decline following a massive expansion.
The primary thing going on in the market right now is a lot of companies simply over-hired during the post Covid boom and they’re correcting for that.
People have been bogeyman-ing offshoring since before I entered the industry, and it's never been all that significant a factor. Time zones are a big piece, but there are a lot of other factors that make offshoring less appealing than a naive analysis of fully loaded cost per head.
In my very humble view, the mythical 10x developer can now be a 100x developer, and the 2x developer usually stays a 2x developer. We live in two parallel worlds right now. Some run an army of agents and ship somehow working and testable code, and some try to prove AI is not as good as them.
I’ve never really found there to be all that much of a market for specifically C++ developers. If you do decide to look for work more seriously I wouldn’t be too hung up on language; if you can code in one you can pretty much code in all of them, and I’ve never hired a developer for specific language skill outside of a few rare cases where it’s something really specific we are trying to fix (e.g. Erlang or something), and even then it wouldn’t be a complete showstopper.
YMMV but that’s coming from a guy who writes in at least 3 languages at current $dayjob.
Greater Boston area here. I've worked in C++ roles at two companies over the past three years and both times we were desperate for competent C++ developers. Similar trends for both companies: we had positions open for ~six months, interviewing many candidates, and being disappointed at their quality. We eventually filled the positions (about a half-dozen in total) but it was not easy. My current company, but different team, still has a quite a few recs out for C++ devs.
TL;DR - at least in my little bubble, the C++ systems engineer market has been consistently hiring people, though good engineers are hard to find.
I just can't get over how short and intense the period between 2021 and 2023 was. There was SO much hype, such stupendous hiring, in such a compressed timeframe. Within the span of like 9 weeks it went from full steam ahead to completely seized.
At the same time, the economy at large didn't seem to change very much.
Yeah but like... area under the curve. All of those jobs and more were added very recently, 2020-2022. It's a major retreat from that growth, but the trend since, say, 2008 or 2010 or 2018 is still positive growth. 2020-2022 was just a huge shift due to a very weird Covid/ZIRP market.
I have PayPal, Amazon, LinkedIn, <<mid size company>>, <<mid size company2>>, <<startup>> as a manager on my resume. I didn't get a call back after applying for two weeks.
I got 4 or 5 standard rejections.
I have non-English name so that definitely hurts. I have AP EAD which is a stage between H1B and Green Card and I still require sponsorship. It's complicated but I can't just switch to EAD right away.
It's not just engineers. It's managers and experienced people as well. Don't believe the top comment that it is bimodal. Unless you are a superstar (99.99%) it's becoming hard to get noticed. I thought of going back to an IC role but it is hard to pick up and do leetcode all over again. It is extremely hard with a special needs kid at home.
I had a managerial* position I ghosted because they had Leetcode literally written on the agenda.
* - managerial is really "Lead" here. A Lead is expected to be hands-on as well as have serious managerial experience. Since it's easier to lie about managerial experience, you have people lying their way into these roles and becoming terrible managers.
Those are raw numbers. I would look instead at the job changes over total employment numbers. I don't have the numbers but I would wager we have many more people working in tech today (overall) than we did in 2008.
Also, that spike in 21/22 really did a number on people's expectations. The one constant in this industry is its cyclical nature.
Maybe I'm reading the graph wrong, but the decrease comes after years of continuous growth, so total employment numbers in tech should still be absolutely massive compared to 18 years ago?
If it continues, then yes it could be bad, but so far it seems like a correction for over-hiring in 2021 - 2023. Seems a little weird to be focusing on a decline in 2024 - 2026, without addressing the large increase right in the years before.
There's a lot of dynamics where it's the short-term numbers that matter. If you're a developer who needs a new job after your spouse got transferred to LA or something, it does you no good that the absolute numbers are massive, nor that a different person looking for a job 3 years ago would have found it uncommonly easy.
I had no idea I was in such an exclusive group back in 2000. Everyone I knew was a software engineer or in tech one way or another so I suppose I got a warped sense that I belonged to a larger group.
I'm not sure the nation wide raw statistics are that reliable in the field of software engineering without interpretation.
In the 90s tons of people who were de facto software engineers were listed as "Information Technology Workers". I suspect a lot of that still hasn't been shaken out of the system.
According to the BLS in the year 2000 there were 3.4 million information technology workers.
BLS had some classification changes over the years. I think it's interesting in the "this is how people thought about the role over the decades."
Today there are computer programmers (15-1251), and software developers (15-1252), and web developers (15-1254).
In 2018, there was a reclassification - https://www.dol.gov/sites/dolgov/files/ETA/oflc/Presentation... - where 15-1132, Software Developers, Applications and 15-1133, Software Developers, Systems Software were reclassified into the software developers (15-1252) group.
The other thing that confuses this is that a lot of positions were classified as Computer Systems Analysts, because that's a position a TN visa holder can be hired for (there is no software engineer in there... and it wasn't until relatively recently that one could be a "software engineer" in Canada without being an Engineer).
Computer programmers - 1010 - 15-1131
Software developers, applications and systems software - 1020 - 15-1132, 15-1133
Here "Computer programmer" was the more junior classification, and software developers working on a word processor were classified differently from software developers working on the operating system... and those were the more senior positions.
Software Developers
Research, design, and develop computer and network software or specialized utility programs. Analyze user needs and develop software solutions, applying principles and techniques of computer science, engineering, and mathematical analysis. Update software or enhance existing software capabilities. May work with computer hardware engineers to integrate hardware and software systems, and develop specifications and performance requirements. May maintain databases within an application area, working individually or coordinating database development as part of a team.
Computer Programmer
Create, modify, and test the code and scripts that allow computer applications to run. Work from specifications drawn up by software and web developers or other individuals. May develop and write computer programs to store, locate, and retrieve specific documents, data, and information.
If anyone saw that LinkedIn post about someone at Block resigning out of guilt after being offered a raise and retention package following the layoffs, I'd say that is a signal that tech is heading down.
Most people would be thankful to have a secure, well-paying job in the post-AI blow-off; increasingly it's going to get harder to differentiate yourself from anyone else using AI. That we have people still in the thick of AI who don't understand that is a strong signal that the AI boom is still going to come take some jobs.
If you're in a software related role and AI isn't making you more productive, it's on YOU as a dev to figure things out quickly.
AI is coming for your job so you can either be an AI manager, or you can get managed out for AI.
caveat: This is my take as someone who used to do a lot of hand coding, and now regularly has a small team of AIs doing anything that would normally have required mostly brute coding strength but not too much thought: faceted plots, refactoring libraries, improving pipeline efficiency, adding parallelization where possible, building presentations, adding test coverage.
Remember, it's the first derivative. The title and chart suggest at first glance that there are fewer tech jobs than at the start, but really there must be way more.
I still kinda want to see this going back to 2000. That must be the biggest tech crash by far. 2008 and 2020 were overall market crashes, but tech was booming.
Something needs to be done about these bots, it is getting eerie. Yesterday a bot created an account named 100xLLM the moment I responded to it to respond back with.
I'm looking for a job (in Rust) now and it's absurd how many positions are for training LLMs - in Rust! (yeah, let's help the people that wanna put everyone out of jobs)
Well, looking for a “programming job in $x” is going to be a problem going forward. Programming itself is becoming a commodity thanks to AI, and it's harder to stand out in a saturated market. I sell myself as someone who can get stuff done using technology and can lead larger initiatives.
As a hiring manager, I have to write the job description. HR is responsible for posting the damn thing where people can see it, then into the ATS you go.

We also know recruiting posts can be a source of competitive intelligence and a signal for investors. We don't want them used that way, but we're aware of it. Bit of a dirty secret. That means, alas, the only people hurt are the applicants looking for work. I'll work through the queue when I have a billet to fill, but otherwise... you're shouting into the void.

Not sure who is responsible for reporting headcount increases to BLS, but I've actively looked and never found the person. So... I honestly have no idea how they get their numbers unless there is a pipeline from the major payroll processors; which feels kinda ick if you think about it.
https://www.bls.gov/k12/teachers/posters/pdf/how-bls-collect...
This says statistics; I've seen unsourced articles saying that they pull Unemployment Insurance numbers as part of it, which are part of the payroll process, but BLS seems to say sampling and surveys.
I had an AI-generated answer for you, but then I realized something deeper: moral hazard.
> Moral hazard is when one party takes actions that impose costs on others because they don’t fully bear those costs themselves. With ghost jobs, employers get benefits (brand signaling, resume mining, internal optics) while job seekers eat the time, emotional, and sometimes financial cost of chasing something that never really existed.
There's a dozen different angles all coming out at once. I'll try to summarize some.
- really wants to hire H1B, but needs to pretend to interview first for compliance. These usually have absurd requirements to make it viable to reject anyone.
- really wants to do an internal or referral hire or promotion, but needs to interview for HR compliance. These usually have such specific requirements that only the person they want qualifies.
- posts jobs because a company wants to look like it's growing, even when it's not.
- posts jobs to either signal to an employee that they are replaceable, or to try and relieve a stressed employee that more help is coming. Either way, it's a bluff
- yes, sometimes you want to hold out for the perfect unicorn and are not in any way in a rush to find them. There's no distinction for this, but job posts are cheap so why not?
- outdated posts that stay up because there's no rush to take them down.
- a technique used to lower compensation. They post a job, see how many applications it gets. If it's more than enough, they take it down (with no interviews) then put it up once more at a lower rate. Repeat until not enough people apply. This may or may not lead to interviews because the actual goal is market probing.
- purely to advertise the company instead of actually hiring. Usually done at career fairs, where you talk and realize there's no actual open position.
It's probably not the most efficient means, no. Probably one of the cheapest methods, though. It's definitely not something you can get away with in a good job market.
There's an IT careers site that was sold and, I believe, went through a re-branding. Now they also offer AI and "personal" resume reviews _and_ writing, cover letters, and they even have members do a 10-15 minute AI virtual interview that ostensibly could be shown to a hiring manager.
I was unemployed as a PM for about three months. I applied to on the order of 100 roles at this site, as well as submitting applications on the other sites you'd expect, from LI to more niche ones.
I felt that this site was "underperforming". I never heard back from jobs I'd only really seen there, and the jobs I did see there were also advertised in other places.
What sealed it for me was that towards the end of the three months, I got an email from the site. "Your profile has been viewed". I open it, "An employer is looking at your profile". I'd never seen this type of email from them before, and sure enough: "Your profile has been viewed 1 time in the last 90 days". That was it. No contacts, and only one employer has even looked at my profile on the site (and this is the kind of site where that'd be the only place they could look at your application). And that employer didn't even have positions open.
But the site does ask you questions to "submit to the employer" about "why you want to work here" "why you'd make a good fit", etc.
And I'm entirely convinced that only a (very small) fraction of the jobs they're advertising are "real" and ever reviewed by anyone at all (maybe the "promoted" jobs?), and that they're harvesting positions from other sites or employers (there are positions that don't actually seem to exist, or at least no ads for them anywhere else)...
... and that their chief motivation for this is getting all your answers to train their models for their actual revenue generator - AI resume writing, cover letter writing, etc. All pre-seeded with other people's real answers to such questions.
Two reasons. One, they have already filled it internally but legally have to post the job. Two, they are gathering data on market trends and what salaries people will take, which is useful if they are considering firing people and rehiring with lower salaries.
I've applied for many jobs where I was perfectly qualified and got rejection notices immediately. I applied on a Sunday and got rejected on Sunday an hour later. No human reviewed that application I made, it was auto rejected, and if that's the case, what other explanation is there than "ghost jobs."
> and if that's the case, what other explanation is there than "ghost jobs."
You didn't pass some arbitrary ruleset given to an AI or machine learning algorithm.
Companies can be very selective now, and usually implement this selectivity fairly stupidly. There also is the problem of being genuinely swamped with bullshit applicants for positions, so the false positive rate is likely quite high at the moment.
I've found it extremely difficult to sort the wheat from the chaff right now. Finding competent people is more difficult than ever, but the sheer number of applicants is at least an order of magnitude higher. Botting has made applying to jobs exceedingly low friction, so there is very little downside for someone entirely unqualified to apply to 600 jobs a day and hope they get lucky.
We have positions that have been open for months that go unfilled simply due to lack of time to sort through applicants, and the few we do have time to interview usually are obviously unqualified within the first 5 minutes of talking to them.
I can't imagine applying to a job where I didn't already have some sort of personal connection. That was already true, and that's even more true now. Likewise, these days as a hiring manager I'd be unlikely to hire someone that came in via random application for the same reason
This is undeniably happening as well. Totally agree.
I've just had lots of rejections, including some where I was a good fit, so I don't think "AI auto rejection" is the only story. I have good credentials, several F500 experiences, and no big career gaps.
The only real success I have had in the last few years is targeted emails (from who is hiring on HN) or through my network.
It's very different than at any other time and I believe it is a combination of a terrible market, AI rejections, and ghost jobs. And I'm sure there are more than a few ghost jobs.
Oh definitely. And our hiring practices are not exactly state of the art. I'll be the first to admit they need a giant amount of improvement.
Most of the good folks have come in via word of mouth and networks, as they typically do.
For those outstanding positions they are "very nice to haves" but obviously not critical. When the right candidate gets matched we'll jump on the opportunity, but it's not an existential problem for the moment.
This is a culling, and the fake-it-until-you-make-it crowd that focused on surface-level knowledge are finding out why they should have gone deeper. The ones who honed their craft and really focused on the foundational, core stuff are in demand.
I am in my 20s. At the moment I've got a part-time job, but I am preparing myself for the worst. In the next few years I am planning to volunteer at farms through Workaway. Maybe one day I can become self-sustainable, tech- and nutrition-wise.
Also getting into plumbing. Curious to see what others are doing in this regard.
A look at the number of replies to "Who's Hiring?" each month over the past year or so compared to prior years made it loud and clear. Traditional tech has been in a recession for a couple of years, at least!
What that graph looks like to me is that there were a ton of marginal folks hired during COVID, and we're shedding a lot of those folks who probably shouldn't have been hired anyway.
Unfortunately... I want to mirror this sentiment. I interviewed a lot of candidates (and worked with many teammates) in my last few roles and I saw some pretty worrying trends...
Anecdotally seeing this play out in SF right now. We're hiring for our fintech startup and getting 500+ applications per role, many from people at companies I would've assumed were stable. A year ago we struggled to get anyone to even look at us. The talent pool is incredible but man it feels weird benefiting from other people's misfortune.
I wonder how the figures look for countries outside of the United States.
For what it's worth, I ended up getting a tech job in Japan instead. Ironically, the requirements at U.S. startups are much higher, and U.S. startups fit the stereotype of Japanese work culture more than Japanese companies do nowadays.
A key thing in this graph is that it doesn't seem to correlate with the rise of code assistants. That's a (relatively) recent, last-year thing from my point of view. Yes, they existed before, but they hadn't really hit in a way that I think shifted hiring decisions. This is just tech laying off, not AI taking jobs.
Looking at the employment report that came out today, tech seems to be doing better than most sectors...
"Other professional, scientific, and technical services" grew month over month and year over year
"Information" took a hit, but the bulk of that was "Motion picture and sound recording industries"
"Computing infrastructure providers, data processing, web hosting, and related services" modestly shrank, but "Web search portals, libraries, archives, and other information services" is the only area under Information to grow.
This seems different than what the post says. They also said it's worse than 2008, but didn't post any supporting information. I would imagine the total market was much smaller then, so while the total number of jobs lost was probably smaller, the percentage was probably larger. When I started in 2012, tech would take anyone with a science degree.
I don't understand the job titles being proposed in the post; are they using different BLS data than me?
Are we in the Cathedral or the Bazaar now? I get that confused. Everyone upload their code to GitHub --keep your truth (philosophically AND mathematically) in the Cloud ;) Oh and don't forget to document your critical thinking, on Slack. It goes much deeper tho.
Counterpoint: I've been desperately trying to code myself out of a job for almost 4 decades now. I inevitably ended up (and keep ending up) getting more responsibilities instead.
So have all the great engineers I've been working with - there's a deep desire for growth past the things that you're currently good at.
The people worrying they might code themselves out of a job are in a different skill demographic. (Ironically, that means they won't be able to code themselves out of a job)
Are you saying you want to be laid off with a nice package as you’ve been with the same employer for a long time? Couple of options: have a nice conversation with your manager and make this clear. The signs are clear that major s/w layoffs will happen in the next couple of years. Other option is to ease off - your high salary and low output will put you in the dustbin list
> Counterpoint: I've been desperately trying to code myself out of a job for almost 4 decades now. I inevitably ended up (and keep ending up) getting more responsibilities instead.
What exactly do you mean by that? Do you mean you finished one project but your employer had another one for you, which you then were expected to work on instead of sitting idle? Or do you mean you coded yourself into a "promotion"?
My comment was just mocking the foolish selfless ethos of many software engineers, who don't look out for themselves and idealize giving to psychopathic organizations that will screw them the moment that's advantageous. Many software engineers have a pathological level of naivete and confusion about the role they really inhabit (e.g. righteously going on about buggy-whip makers).
I have been getting automated emails from a slew of different recruiters now. Usually one or two per day. I believe they are LLM generated. However, I usually don’t respond.
The next step is for me to respond with an LLM. Maybe if my LLM is good enough it’ll convince their LLM to skip the interview and just offer me a job.
This chart shows that the rate of year-over-year, month-by-month change is worse than 2020.
But the number of tech jobs has grown by 12% since April of 2020 (2.34M vs. 2.63M). Heck, there are more tech jobs today than at the beginning of 2022 (2.61M), even.
Job market sucks, trend is bad, but post title is a misnomer for what this chart shows.
(Numbers based on a quick grab BLS.gov data of CES6054151101 (Custom Computer Programming Services) + CES5051800001 (Computing Infrastructure Providers, Data Processing & Web Hosting) + CES6054151201 (Computer Systems Design Services)---couldn't find other ones quickly and gave up :))
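The growth figure is easy to sanity-check; a minimal sketch using only the numbers quoted in this comment (not re-pulled BLS data):

```python
# Sanity check of the growth figure above, using only the numbers
# from this comment: 2.34M tech jobs in April 2020 vs. 2.63M today,
# summed from the hand-picked BLS CES series listed.
apr_2020 = 2.34e6
today = 2.63e6

growth = (today - apr_2020) / apr_2020
print(f"Growth since April 2020: {growth:.1%}")  # ~12.4%
```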
Good to know it’s not just me. Sheesh. Are there signs that it’ll bounce back again?
I’ve been looking for work for nearly seven months. I can write low level systems code in C and C++ to web applications in Python and compilers in Haskell. I have tons of industry experience.
Yet most places I apply to ghost me or follow up a month later that the position has been filled.
Companies that have been laying off people claim they are seeing record profits.
It seems like we went from relatively stable growth to just chaos.
There's still the hangover from free Covid money. I think the number one reason that it feels worse is that there's a LOT more people in the industry now than back in 2020. Much more competition than before.
Those of us who have been in the industry for 15-20 years remember a time when tech was just a job.
In the mid 2010s, then most notably in late 2020 - 2021, you had people who had no interest in tech entering the industry because they saw it as an easy career to make decent money in.
It got pretty bad in the late 2010s, but it became almost comical in 2021, with people who took a two-week coding bootcamp suddenly landing 6-figure jobs. Some of these people were even working at multiple companies at the same time.
The optimist in me hopes this all shakes out with the people who had no interest in tech moving on to other things. These types were not only bad employees, they were also bad for the industry, and in my opinion responsible for the culture shift from tech being dominated by "nerds" and "geeks" in the 90s/00s to the modern "tech-bro" stereotype.
The realist in me, though, will continue to warn people that the tech job they're working today is likely their last. Between tech industry growth slowing, the excessive overproduction of tech talent, and AI + SaaS automating a lot of traditional software development work, it's going to be exponentially harder to remain employed in tech in the coming years.
So much so you might as well find a relatively worse paid job if it means you don't have periods of months of unemployment every year.
>In the mid 2010s, then most notably in late 2020 - 2021, you had people who had no interest in tech entering the industry because they saw it as an easy career to make decent money in.
I remember it was even earlier than that, in the 90s when Bill Gates became the richest man in the world.
You're right about 6 month bootcamps leading to jobs in 2021 though! A true gold rush.
If things continue to get worse I really worry how many people might give up on life entirely. A lot of people in this industry don’t have a whole lot else going on for them, myself included.
I grinded my 20s away trying to have a successful career and if that just gets pulled out from under me I’ve got absolutely nothing.
I know a very senior engineer who took his life the day after Trump was elected. He had been unemployed for a while.
While I think a lot more was going on with him than being unemployed, I'm convinced AI hitting the scene had a bit to do with it. He was an older dev, 50+.
In fact in sectors like the game industry the pandemic resulted in a massive hiring boom. The layoffs only materialized after the pandemic was well and truly over.
Yeah, the pandemic was good overall. The next generation of consoles was looming, sales were up since people were forced inside, and there was finally some loosening of dev-kit practices to accommodate the lack of offices to go into (in 2015 I never would have imagined having a dev kit in my home 5 years later). Games were pretty much the only art medium to benefit from the times, while cinema collapsed, streaming services were running a deficit war where no one won except Netflix, and music stalled for a bit.
But as per usual, the bust hit just as hard as the boom. Multiple high-profile failures in games and initiatives as a whole, Microsoft and Apple decided to stop bleeding money with their respective subscription deals, mobile gaming (from the advent of Genshin Impact and co) became less an easy cash grab and more a second wing of AAA development, and investments dried up overnight for indies (unless 'AI').
On a personal level, I was hired (by Microsoft, my first and only "big tech" job) in April 2020 and I am still working here... all these companies "over-hired" during the pandemic, and the term "covid hire" is even a thing.
I remember interviewing someone who got hired by Facebook, sat around for a few weeks for a team to open up while they went through onboarding / Junior training, then was let go.
COVID did weird things to the industry, that's for sure.
Before Musk made it cool to mass layoff, there was a genuine belief inside of Facebook/Meta that great engineers were extremely hard to find or hold onto and if they weren't on the payroll at Meta, they would go somewhere else.
There was always a "clock" for junior engineers to prove they could handle the high pressure and high intensity work, and as long as they were meeting the bar, they were safe.
They called onboarding "Bootcamp", and it was for every engineer, junior to staff, to learn the process and tooling. Engineers were supposed to be empowered to take on whatever task they wanted, without pre-existing team boundaries, if it meant they could prove their contributions genuinely improved the product in meaningful ways. So: come in, learn the culture, learn the tooling, meet others, and then at some point pick your home team. Your home team was flexible, you could spend weeks deciding, and even after you selected one you could always change, no pressure. Happy engineers were seen as the secret sauce of the company's success.
I remember that summer, vividly. They told the folks in Bootcamp, pick your home team by the end of the week, or you will be stuck in Bootcamp purgatory. At the same time they removed head count from teams, ours went down to a single one. A new-grad, who had literally just arrived that Monday, picked our team on Tuesday, and then had to watch as most of their fellow Bootcamp mates got left behind.
People wondered for weeks what would happen to them, and then, just like that, the massive layoff sent them all home. It was shitty because from where I sat, it was basically a slot machine. Any one of the folks in Bootcamp was just as capable, but we had one seat, and someone just asked for it first.
I seem to hear often that Meta is perhaps the most egregious offender of "hire to fire". Seems really wasteful. But man, they pay their employees a lot.
Overhiring implies that MSFT's headcount went down over this time. But that doesn't seem to be the case. They still hire a lot, just not in North America.
I know people will say AI, but I don't think it's that. The whole "everyone should learn to code" bullshit of the last 15 years or so has created a lot of developers that frankly aren't very good, and then you mix in the massive overhiring of the pandemic, and what you're seeing is a hard correction. CEOs love to use "productivity improvements from AI" as a smokescreen and investor catnip but the research shows it's not having the effects claimed.
Blaming the employees is BS. A pretty large % of the people losing their jobs are also high performing excellent people. I feel like anyone who worked at one of these companies doing lay offs knows this.
Sorry, I didn't mean to imply that the people that got laid off were the ones that deserved it. Obviously politics, situational factors (wanting people back in the office), etc are a huge part (I myself am recently laid off and I don't think I deserved it!) What I was trying to drive at is developers have lost a lot of leverage and negotiation because there are too many people fighting for too few roles. Also I can't help but wonder if people have lost trust in developers because of the dilution of talent.
That being said, I don't think it's unfair to point out that creating a massive influx of new developers without jobs that provided good mentorship (most jobs are awful at mentoring junior developers) is going to have huge consequences that we're now dealing with. I think the "learn to code" thing was a massive mistake. Encourage the people that want to, sure, but don't try to pull people in that are only marginally interested in a paycheck.
The layoffs are not necessarily executed in a way that takes performance into account, but that doesn’t mean that the industry overall doesn’t have too many people for the amount of work that needs to be done.
It's only the part about casting aspersions at the people laid off for being low performers that bothers me, because I know so many incredible people who absolutely did not deserve it, and it's not fair to assume anything about their value or the quality of their work.
> that doesn’t mean that the industry overall doesn’t have too many people for the amount of work that needs to be done.
Not that I disagree with you here, but it is hard to square this with people who are also saying not to worry about AI displacement because there's limitless demand for software.
Depends on the industry and product, as usual. At a macro level, I do not think there are "too many engineers and not enough problems to be solved". Companies are simply hunkering down for a recession we can't say out loud.
Also wtf were we supposed to do? I graduated during the great recession. No one was hiring. Everyone from the president on down told us to learn to code. So we did.
I don't recall any recession with respect to tech in 2020. It was a hiring frenzy.
Any commentary about tech jobs that does not include the interest rate environment and the massive over hiring that occurred between 2019 and 2022 is borderline dishonest.
Look at federal data on SWE job postings and the federal funds rate for the same period. The jobs series is a giant mountain peaking in '22. The interest rate is zero through the pandemic and then spikes right when SWE jobs start to collapse.
Tech hiring is all downstream of interest rates. AI has had almost no impact, at least not yet. (Block's layoffs were not AI; look at their stock. They can basically only succeed as a financial company when money is free. Very misleading, and a convenient excuse for terrible management to now say they need to be "AI native".)
My friend who has worked for some big name companies is absolutely struggling to find work. So many interviews, 3rd, 4th round, and then they go with another candidate, or I can only assume someone internally and the listing never existed. It's killing me watching him struggle to find work.
What happens when AI gets so good that even "Juniors" can compete with "Seniors"? It's going to happen at SOME point. I think what will separate devs will be their creativity and ideas. Those that can think outside the box will be the ones getting hired. With that said, I sometimes feel like eventually the OFFICE MANAGER will be coding everything for their company lol.
Software will become cheaper to write and more software will be written. A senior will always outcompete a junior when it comes to logical thinking and a true understanding of IT. A programmer will always outcompete a non-programmer when it comes to using AI tools.
Bottom line, if programmers are fucked, so is just about anyone else.
The graph does a really poor job supporting the conclusion, most obviously because it only goes back to 2016, the peak of boom times, it doesn't go anywhere near 2008 so why does the caption talk about that? Just this same graph alone going back to 1990 would be super eye-opening.
The other thing is it's showing first derivative, not absolute numbers, which is a very questionable way to derive "worst employment situation" in a field that has been on world-changing boom over the last 50 years.
The question that I have for this data though is that its showing the derivative - the change each year in hiring.
The dot com crash is clear and very visible in there. The global financial crisis is also a dip in there (I'm saving this for when people claim the number of jobs lost compared to the dot com crash).
From 2010 to 2020, there was a fairly steady linear growth of employment. There was the dip in 2020, but 2020 to 2024 had a much higher peak. My "I want to know about the data" is "is the area above +150k jobs from 2020 to 2024 greater than the area below 0 from 2024 to 2026?"
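Since the chart plots year-over-year changes, that "area" question is just a running sum of the series; a minimal sketch, with purely illustrative numbers rather than real BLS data:

```python
# The chart plots year-over-year changes (the derivative of
# employment), so the net effect over a cycle is just the sum of
# those changes. These numbers are purely illustrative, NOT real
# BLS data: pretend gains for the boom, pretend losses for the bust.
boom_years = [200_000, 250_000, 180_000]   # hypothetical gains, 2021-2023
bust_years = [-60_000, -90_000]            # hypothetical losses, 2024-2025

net_change = sum(boom_years) + sum(bust_years)
print(f"Net employment change over the cycle: {net_change:+,}")  # +480,000 with these toy numbers
```

Under these toy numbers, the boom's area dwarfs the bust's, which is the shape of the question being asked about the real data.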
Although the graph lists BLS data as the source, it's hard for me to find the specific datasets that back it up. It's March 2026 and the graph indicates it encapsulates 2025. In fact, "Software Development Job Postings on Indeed in the United States" indicates something different:
https://fred.stlouisfed.org/graph/?g=1T60O
- Only regional data is available; numerous national statistics are discontinued
- California region matches up, but places like Boston don't https://fred.stlouisfed.org/series/SMU25000005051320001
- Doesn't include January or February 2026 data, doesn't match up with graph in tweet
I wasn't able to find the following:
- Custom Computer Programming Services
There are numerous open questions in this analysis which I would need to be addressed before drawing any conclusions. My gut feeling would love to accept it at face value but I never trust my gut.
2008 wasn't that bad for tech and neither was 2020. 2020 in fact led to the craziest tech hiring frenzy ever. It's no wonder the entire industry has indigestion from so many workers hired.
Purely anecdotal, but I'm a senior engineer with about 15 years of experience and a decently impressive resume. In the past, I almost always got to at least the interview stage, and frequently received multiple offers at the same time. Recruiters used to spam me constantly.
I haven't heard from a recruiter in probably 6 months. I recently put my feelers out and applied to a handful of positions I was qualified for, and got rejection letters from all of them.
Try only including your last 2 positions, as most senior staff tend to scare insecure brogrammers. They probably still won't hire, but you will get more interviews.
There are also a lot of people posting fake jobs for feeding LLM datasets, running scams, and bidding down labor costs. =3
As someone who has been hiring the past six months, the candidate pool has been absolutely abysmal. Slop resumes, misrepresentations, bad interviewing skills, etc. I'm baffled at the other end at how much inbound crap there is.
After reading many anecdotes of top-school alumni struggling to even secure interviews, I'm really curious what opportunities are available for the median American fresh grad, e.g. a 3.0 GPA from a T100-200 uni.
This is the plot of the first derivative of employment. It shows a comparatively small but lasting dip after a massive, prolonged, and unwarranted boom between late 2020 and early 2024 that coincided with the dying breaths of ZIRP.
I have no idea about what's coming, but I wouldn't pay a whole lot of attention to people who are looking at the plots of a highly volatile and cyclic industry that goes through constant boom-and-bust cycles, and are trying to position this as proof that AI is or isn't having an impact.
This industry is a race to the bottom and long overdue for a massive salary reset. There is no society on Earth where someone who codes JavaScript (poorly) should make more money than, say, a doctor. Yet here we are.
A lot of doctors are just pill prescribers. I can't tell you how many times I've been to useless doctors that don't help and give you a big fat bill on the way out the door.
Yet they're all expendable because their chosen skills are not essential for society to survive. They could all disappear tomorrow and the web and everything around it would move on.
Good. Too many useless people got into tech because any monkey can memorize LC and outperform the monkey who also memorized LC and now is part of the interview panel so they can show off to the other monkeys that he deserves more bananas.
Tech was and still is the easiest way to make 200k base salary, before even thinking about the stock.
We need a reset and anyone who can’t make it can go fill the jobs we need in construction, education, etc.
Anyone else's inbox slammed with recruiters, more than it ever has been in the past? Feels like there's 10x the jobs available, but perhaps it's just that LLMs have automated a recruiter's job and they're letting the slop fly
What mediums are you using for recruiters to contact you? Do you have a linked-in or are you applying directly to recruiting companies? Are you active anywhere else?
Genuinely interested in how you're receiving so many recruitment emails. That used to be my go to way to hit the job market.
I'm not applying at all! Happy employed and not looking.
They're messaging me through LinkedIn (I have a profile, it's not set to "looking", and I'm completely inactive there), and/or finding my email on the internet somehow and going direct.
I mean, didn't a ton of people get hired immediately after 2020, during the pandemic? Also, I don't remember the tech sector getting hit too hard during the 2008 time period; it was mostly everyone else.
There's another post for it that went back to the early 90s. I've grabbed it at https://imgur.com/a/kB9CAKF though it's a smaller image (Imgur resizing).
It's bad, yeah, especially for folks on the job market (it me). Some statistics first, from my own job search logs:
* Since I hit the pavement in late January, I've tracked 100 job applications
* Of those 100, only 7 have turned into interviews
* Of those seven interviews, 3 turned into second-round
* ~50% of all applications never receive a response
* ~20% of rejections for any reason have the role re-posted within thirty days
* For rejections stating "higher quality applications", that role re-post rate is closer to 50%, suggesting ATS systems culling too many candidates to fill the role or ghost jobs
* Despite my state requiring salary ranges be posted in the JD, only around 70% of postings included what could be considered "reasonable" estimates
* 100% of interviews have been for local employers requiring 3+ days on-site
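To put that funnel in terms of conversion rates (counts taken straight from the stats listed above):

```python
# Conversion rates for the job-search funnel above, using the
# counts from this comment's own search logs.
applications = 100
first_round_interviews = 7
second_round_interviews = 3

print(f"Application -> interview:  {first_round_interviews / applications:.0%}")        # 7%
print(f"Interview -> round two:    {second_round_interviews / first_round_interviews:.0%}")  # 43%
print(f"Application -> round two:  {second_round_interviews / applications:.0%}")       # 3%
```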
And now, some observations not captured in the data directly:
* Employers are trying to "under-title" folks; Senior roles want to hire former Leads, and Management roles want next-rung candidates for prior-rung titles (e.g., hiring what should be a Senior Manager for an entry-level management role)
* Employers are also trying to underpay workers by a large margin, especially folks coming from Big Tech ("We don't pay {SV_FIRM} money" while offering salaries below the local 50%ile for the role in question); they're blaming a "surplus of tech talent", which may or may not be true (I lack the data to prove either way)
* The two above points are in conflict, because rent/mortgages in these areas are so steep that even with major lifestyle changes to cut costs, these wages simply aren't survivable for local areas
* "Credential Creep" is back in force: Architect certs required for mid-level engineering roles, buzzwords prioritized over outcomes and achievements, and AI ATS' rejecting qualified candidates flat-out
* College Degrees are relevant again as a means of pruning candidates; fifteen years of experience is irrelevant for a lot of Senior roles if you don't have a BS or Masters, which wasn't the case even last year
* Industry-specialization is also back, even for roles where industry specialization is generally moot or easily picked up (e.g., Corporate IT stuff)
* A significant number (~75-85%) of roles explicitly reject H1B and other visa workers; not a problem for me (Citizen), but this is the worst possible time to be job hunting on a non-LPR status.
And now, my personal experiences:
* There's a very strong attitude of "you're being entitled" when it comes down to salary negotiations, even when you show your math for essentials - and share prior compensation history reflecting the cuts you've already taken since your Big Tech salary to "rejoin the market".
* Employers generally have no clue how expensive it is to live right now, especially in major metros; one such employer who balked at my comp floor genuinely had no clue the median rent was three and a half grand per month.
* Compensation seems particularly tilted towards working couples; as in, neither alone makes enough to survive, and employers assume you have a FTE spouse to shore up finances so they can pay you less
* Employers also don't seem to know what they actually want or need. Specialist Engineer roles (e.g., Cloud Engineer, Network Engineer) cite required experience and expertise with the full technology stack inclusive of ERP and HRIS nowadays, which is something that used to be handled by a specific team for the entirety of my career thus far, even in smaller (<1k) orgs. I've also seen Architect roles demanding Help Desk work, and Software Dev roles who want experience supporting Entra.
* AI does not feature in as many interviews as I would've thought. The few times it does, it's very much a "that's nice, but we're taking a wait and see approach" attitude
* There's a lot of eagerness to hire domestically again (I think even middle managers were tired of outsourcing or offshoring), but a lack of budget to afford domestic talent.
Ultimately, it's pretty bleak - but still better than last year, at least thus far (~300 apps, ~2 companies interviewed with, 1 offer in 2025). AI isn't the value-add I was sold on by career counselors and LinkedIn (huge surprise there /s), and there definitely seems to be the appetite to hire, but not the realism of what to expect or how much it'll cost. I very much view it as a sort of tug-of-war at the moment, between workers who did everything expected of them and have cut to the bone already, and employers who somehow think they can pay <50%ile wages while mandating 4-days on-site in a major metro for experienced talent.
If you're an employer looking to hire, I have some advice:
* Ditch the AI ATS or AI summaries and read resumes, especially if you're requiring local presence.
* Understand what you need (and what that will cost you) before posting the JD
* Understand the local cost of living, and budget accordingly (i.e., if your Senior Engineer can't afford median rent, they're not going to stick around when things improve)
* If you value loyalty and aren't paying TC to afford a median home in the area, then you don't actually value loyalty
* Don't pigeonhole yourself with hyper-specific candidates as a means of winnowing down applicants; that level of specialization will flee the second they get a better offer elsewhere
* Post salaries in the JD, required or not, so you don't waste your time with candidates whose expectations don't align with your budget
I'm early in my search, but compensation seems all over the place. The recruiters I've talked to have been pretty baffled about it too, given the market. One place a recruiter called me about was looking for absolute unicorn talent (like 15 years of experience in multiple very different domains), but the salary was about $70k less than I made at my previous job, and when I asked about titles they said it was "flat" and everyone was just a "software dev".
I don't want to sound like it's all a horror show though, I've had some interviews that have gone well with companies being sensible, so I think there's good stuff out there. But it's overall a rough market.
I'm also looking right now and a lot of that resonates with me. The posted salary ranges are often a complete joke as you noted. "The pay band for this role is $80,000-250,000 commensurate with experience and interview performance". Yeah OK buddy are you seriously trying to tell me you have multiple people with the exact same job title making salaries over $100k apart? Feels like they're just giving the finger to lawmakers through malicious compliance.
I've also run into the industry specialization roadblock a few times. Got turned down by a fintech company after multiple interview rounds because I did not have banking industry experience, for example. I guess I get it as a tie breaker but I've operated in a PCI compliant environment for years, seems like that should count as relevant experience? Also if you're going to dumpster candidates without banking experience why on earth did you waste several hours of your staff's time giving me tech screens?
Job hunting has always sucked. But it feels particularly busted at the moment. The process is miserable. If you've coasted to an easy hiring in the last year, you're either amazing (and hats off to you!) or got very lucky.
The salary ranges are complete jokes on either end: they're either malicious compliance like you pointed out, or completely out of touch with reality.
My example of that was when I applied for an Architect role (as I'm at that point in my upward career trajectory), and they asked me instead to apply for a Senior Admin role as they "didn't know what the Architect role would look like yet". I did, I included my comp target, and got the hard sell on why I was being unreasonable and should take {2016_PAY}/{$100k below SV_FIRM} instead. I mentioned my absolute floor was {$75k lower than SV_FIRM}/{$25k lower than my target}, ran him through my math (median rent for the area, on-site expectations, commute costs, food costs, insurance costs, 50/30/20 budgeting, etc), and pointed out that floor would only cover needs (50) and savings (20) with no fun money (30) whatsoever. Ultimately I withdrew my name entirely because the guy just wouldn't listen to me, and all but demanded I be grateful for his number in the current economy.
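The 50/30/20 math in that negotiation can be sketched quickly. All figures below are illustrative stand-ins, since the actual numbers in the comment are redacted:

```python
# Rough 50/30/20 sanity check of a salary floor. The salary, tax rate, and
# expense figures are hypothetical examples, not anyone's real compensation.
def budget_check(gross_annual, effective_tax_rate, monthly_needs):
    """Return the monthly 50/30/20 split and whether essentials fit in the 50%."""
    net_monthly = gross_annual * (1 - effective_tax_rate) / 12
    needs_budget = net_monthly * 0.50   # rent, commute, food, insurance
    wants_budget = net_monthly * 0.30   # "fun money"
    savings      = net_monthly * 0.20
    return {
        "net_monthly":   round(net_monthly, 2),
        "needs_budget":  round(needs_budget, 2),
        "wants_budget":  round(wants_budget, 2),
        "savings":       round(savings, 2),
        "needs_covered": monthly_needs <= needs_budget,
        "shortfall":     round(max(0.0, monthly_needs - needs_budget), 2),
    }

# Example: $3,500 median rent plus ~$2,000 of other essentials,
# against a $150k offer at a ~30% effective tax rate.
print(budget_check(150_000, 0.30, monthly_needs=5_500))
```

When `needs_covered` comes back `False`, or only holds by consuming the wants and savings buckets, that's the "my floor only covers needs and savings with no fun money" situation described above.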
I suspect something similar is going on with another company that's seemingly ghosted me, after I stated I was targeting the upper boundary of their listed comp range - still $85k below {SV_FIRM}, but with growth potential towards Architect and Director-type IT roles. Even when I'm fine eating huge pay cuts for work (and falling off the homebuying ladder, as not even {SV_FIRM} paid house-purchasing money), the employers out there really do want perfect diamonds for the cost of Halloween trinkets.
> Also if you're going to dumpster candidates without banking experience why on earth did you waste several hours of your staff's time giving me tech screens?
This is also something that's grinding my gears. Had an investment firm put me through six technical interviews with glowing recommendations every step of the way only for the seventh round (CIO) to put the kibosh on it without a reason and after showing up unprepared and disinterested. Also had companies say I lack financial discipline experience when I've literally built models, showback systems, budget forecasts, and cemented six-figures of monthly savings in prior roles; same with companies saying I "lack compliance experience" despite calling out running infra in highly regulated environments, performing compliance audits for clients, and uplifting infra to satisfy compliance regimes.
If I didn't know better, I'd say the entire HR process is just feeding shit into chatbots and letting them make hiring decisions. Nobody seems to actually care about the humans involved or the wider systems at play.
It's immensely frustrating, but I can only keep on keeping on until something changes. I don't need to win every application, I just need to win one.
What are people's plan Bs for when it goes tits up? I reckon I'd make a good electrician or something like that, but it's a real, grown up profession and I would have to get qualified.
We could arrive at the technological singularity and build 8000-IQ robots that can do things in a clean room, but in messy physical reality? I believe they will fail to catch up forever.
They will fail to deal with a stripped bolt head deep inside an engine bay that's been exposed to 40 years of road salt, that needs to be hit right with a 10lb hammer and a home made chisel until shit knocks loose, combined with cutting, welding, drilling, torching, tapping, impromptu redneck engineering, cursing, the use of 8 different kinds of penetrating lubricants, the acquisition of weird and highly model-year specific parts in a junkyard 500 miles away, realizing it's all wrong and doing it again.
Multiply the complexity by 100 times and that's what it's like to take on a classic car project.
Games are a weird sector. Long term I simply want to go indie. Even if it doesn't pan out, it's some kind of passive income and I have something to show that is fully "mine" (so no ambiguity about how much I really contributed at BigCo.).
Short term I'm freelancing and doing whatever else I can find to get by. Hoping for one more full time role before I start my self published ventures.
I'm an outsider, but VC looks like it's in a bad place right now. The returns have been pretty bad this decade, and AI companies are eating all the dollars so if you're not doing that it strikes me it'd be hard to get funding. And idk, if you already have an AI startup then you're hoping for the best, but if you were starting a business today I think that bubble looks like it has a lot of needles around it
As I previously mentioned, based on personal experience, assumptions around hiring have changed due to the Twitter layoffs, demands for FCF positivity, and WFH inadvertently justifying offshoring [0], not necessarily due to interest rate changes.
---
As I also mentioned, the only way you can survive in American tech at this point is to:
1. Move to a Tier 1 tech hub like the Bay and NYC. If you get laid off, you will probably find another job in a couple of weeks due to the density of employers. Seattle used to be a good option, but WA's norms around noncompete clauses incentivize larger employers which reduces the ability for startups to truly scale.
2. Start coming into the office 2-3 days a week. It's harder to lay off someone you have had beers or coffee with. Worst case, they can refer you to their friends' companies if you get laid off.
3. Upskill technically. Learn the fundamentals of AI/ML and MLOps. Agents are basically a semi-nondeterministic SaaS. Understanding how AI/ML works, along with its benefits and pitfalls, makes you a much more valuable hire.
4. Upskill professionally. We're not hiring code monkeys for $200K-400K TC. We want Engineers who can translate business problems into technical requirements. This means also understanding the industry your company is in, how to manage up to leadership, and what the revenue drivers and cost centers of your employer are. Learn how to make a business case for technical issues. If you cannot communicate why refactoring your codebase from Python to Golang would positively impact topline metrics, no one will prioritize it.
5. Live lean, save for a rainy day, and keep your family and friends close. If you're not in a financial position to say "f##k you" you will get f##ked, and strong relationships help you build the support system you need for independence.
The reality is the current set of layoffs and work stresses were the norm in the tech industry until 2015-22. We live in a competitive world and complaining on HN does nothing to help your material condition.
I don't believe this is true. There are plenty of roles that are happy to hire remotely. Sure, there is an in-person requirement for many job listings, but I've found EMs/companies to be very flexible if they need to hire talent.
For people that can’t/dont want to move to the “hubs”, just know that there is absolutely still a career path. I will say though that you need to have above average communication skills and proactively build relationships during in person off-sites.
There absolutely are remote first roles in the US, but the competition is also extremely intense. The median SWE and HNer wouldn't make the cut.
It also requires a level of maturity, clear thinking, self-starterness, and independence that is hard to come by without a proven track record and experience.
My advice is for the median/average SWE and HNer, not for the truly exceptional.
Spending 7-10 years in a hub and then going remote first is the best path, because you build the network you need to get referrals to vouch for you as a remote-first hire, as well as the track record needed to go remote-first.
7-10 years is too much. 2-4 is around the range I would give.
It's also nothing new; new grads gravitated towards these hubs anyway. Previously, they would settle down in the burbs. Now they're migrating anywhere in the US.
I don't know if switching from one extreme partisan site to another extreme partisan site is the right solution just because of a login requirement. In any case, Twitter/X has been online almost 20 years. Bluesky still has to prove its staying power long term.
The justification of better UX seems reasonable regardless of politics. I'd prefer not to have to log in on any HN link, and I also can't say I want to optimize for link health as this is super topical and I will never want to look at it again after this discussion is over.
1. You start off by labelling both platforms as "extreme partisan" - care to explain?
2. This charge is used to minimize the original complaint (login requirement), which is a hard blocker to view replies, i.e. additional context.
3. This all then somehow morphs into a point about platform longevity?
How exactly does any of this address parent commenter's statement that "bsky is just a superior viewing experience."?
Is this relevant to the question of whether we should replace the link? Seems like we're going to spend a lot of time running down the views of the UBO of every domain posted here.
I get rejections for every single position that I apply to. Often I read literally a description of myself in the job posting, and it's still "we decided not to move forward".
At the same time, I hire people and see that 8/10 candidates are just trash. Not in the sense they "are not aligned", or "emit wrong vibes", or other bs. They literally can't write a single line of code, on their own laptop, in their own IDE.
Because the hordes of people who can't find the ` or | or $ on their keyboard outnumber competent people 100:1. I had this exact experience too, so frustrating. I moved to a strictly referral model where I pay my SWEs $10k if a candidate they refer gets hired.
A person who only has real industry experience can very easily have never needed git at all. I know this shocks people who only have hobby or startup experience but git works very poorly at large scale and there are many big organizations who don't use it either because their solutions predate git, or they are newer companies that simply have good taste.
Contrary to what many say, I don't think it's as simple as seniors are getting hired and juniors aren't. Juniors are still getting hired because they're still way cheaper and they're just as capable of using AI as anyone. The people getting pushed out are the intermediates and seniors who aren't high performers.
Personally I think it's a bit more nuanced than senior vs junior (though it is very hard for juniors right now). What I've seen a lot of hunger for is people with a track record of getting their hands dirty and getting things solved. I'm very much a "builder" type dev that has more fun going from 0-v1 than maintaining and expanding scalable, large systems.
From the early start of the last tech boom through the post-pandemic hiring craze, I increasingly saw demand for people who were in the latter category and fit nicely in a box. The ability to "do what you must to get this shipped" was less in demand. People cared much more about leetcode performance than an impressive portfolio.
The market now reminds me a lot of 2008 in terms of what companies are looking for. From 2008-2012, a strong portfolio of projects was the signal most people looked for. Back then being an OSS dev was a big plus (I found it not infrequently to be a liability in the last decade; better to study leetcode than actually build something).
Honestly, a lot of senior devs lose this ability over time. They get comfortable with the idea that as a very senior hire you don't have to do all that annoying stuff anymore. But the teams I see hiring are really focused on staying lean and getting engineers who are comfortable wearing multiple hats and working hard to get things shipped.
Maintaining and expanding is more challenging, which is why I’ve grown to prefer that. Greenfield and then leaving is too easy, you don’t learn the actually valuable lessons. As experience shows that projects won’t stay in the nice greenfield world, building them can feel like doing illusory work — you know the real challenges are yet to come.
Nearly all of the teams I've joined had problems they didn't know how to solve and often had no previously established solution. My last gig involved exploring some niche research problems in LLM space and leveraging the results to get our first round of funding closed, this involved learning new code bases, understanding the research papers, and communicating the findings publicly in an engaging way (all to typical startup style deadlines).
I agree with your remarks around "greenfield" if it just involves setting up a CRUD webapp, but there is a wide space of genuinely tricky problems to solve out there. I recall a maintainer-style coworker of mine, who described himself similarly to how you describe yourself, telling me he was terrified of the type of work I had to do, because when you started you didn't even know if there was a solution.
I have equal respect for people such as myself and for people that you describe, but I wouldn't say it is more challenging, just a different kind of challenge. And I do find the claim "you don't learn the actually valuable lessons" to be wildly against my experience. I would say most of my deep mathematical knowledge comes from having to really learn it to solve these problems, and more often than not I've had to pick up on an adjacent, but entirely different field to get things done.
"when you started you didn't even know if there was a solution."
Regardless what the problem is - as long as I know _nobody knows if there is a solution_ it's an instant sugar rush.
You are free to bang your head against a stone wall for months trying to crack the damn thing.
OFC you need to deliver in the end. And this then requires a ruthless "worse is better" mentality - ship something - anything. Preferably the smallest package that can act as an MVP that you know you can extend _if this is the thing_ people want.
Because that's the other side of the coin - if the solution is not known - people are also not aware if the solution has true value or if it is a guess.
So in any case you have to rush to the mvp.
Such joy!
Of course the mvp must land, and must be extensible.
But these types of MVPs are not a slam dunk.
The combined requirements of a) must ship within limited time and b) nobody knows what the solution is _do_ require a certain mindset.
I've found new hires to be more successful when they join, get some easy wins, and then find their own problems to solve. But maybe it's just an artifact of working at large companies where most of the day-to-day stuff is figured out.
(d) although the initial statement seems credible, the problem is actually ill defined and under specified and therefore not solvable as originally stated.
Example: our start-up plans to "fix health care"
Definitely it's a trap. If you are a purist it's nigh impossible. But if you ruthlessly 80/20 it, most stakeholders will be pleasantly surprised.
I have no clue why I end up in these situations but I sure do like them.
I do realize this sounds like a case of perpetual "not invented here" syndrome, but the technical implementation of modeling aspects for 3D and computational geometry is such a scarce talent that you actually get to do novel stuff for your business.
The last time this happened I designed & implemented the core modeling architecture and led the implementation effort for our new map feature[0]
[0] See section "Stunning new building facades add practical value" in https://www.mapbox.com/blog/detailed-architecture-and-new-de...
It's kind of like when the FAA does crash investigation -- a stunning amount of engineering and process insights have been generated by such work to the benefit of all of us.
Trust me, you get plenty of experience in this as a founding engineer in a startup.
Many of these comments make me wonder how many people here have actually worked at an early stage startup in a lead role. You learn a lot about what's maintainable and scalable, what breaks and what doesn't, in the process of rapidly iterating on a product to find your market.
(For readers, I don't think there's anything wrong with that but it just means that certain perspectives are overrepresented here that may not be more reflective of the broader industry.)
The idea that this means "you don't learn the actually valuable lessons" is completely baffling to me.
Most people I've known with founding engineer experience or similar leave not because it's not challenging, but because it's exhausting.
Increasingly I've realized that the HN community and I are not even speaking the same language.
Even in areas where startups aren't literally creating new product categories like the foundational model providers, the edge of a startup over a more established business is the speed at which they can provide value. What's the point of buying CoolCo when you can go with L&M Inc. that has thousands of headcount working on your feature. The value prop of CoolCo is that CoolCo can roll out a feature in the time it takes L&M to make a detailed specification and a quarterly planning doc breaking down the roadmap and the order of feature implementation.
Now be part of the team of folks that keeps that application running for 10, 20, 30 years. Now be part of the transition team to the new app with the old data. Those tasks will also teach you a lot about system stability, longevity, and portability... lessons that can only be learned with more time than a startup has.
Takeoff systems aren't analogous to prototype development. I don't know how you'd build a prototype plane that's feasible to take to market without having deep knowledge of how planes are built.
Early design decisions matter. And you don't get to that realisation without dealing with legacy systems where some upstart made terrible decisions that you're now responsible for.
“Technologist flavor of NTSB investigator.”
One of the guys had a very strong opinion that the ideal architecture was something as abstracted and object oriented as possible with single function classes, etc. I let him run with that. The other guy got frustrated with his sub-team's inability to write code to spec in a language they'd never used before and where they were trying to build some new features they didn't clearly understand. He developed a strong feeling that TDD was the most efficient path forward: he owns the PRD and design, so he just created test stubs and told the remote team to "just write code that passes the test" even if they didn't understand the function of the block.
So, after a few months where did we end up:
1. The "abstract everything" architect's team had an extremely fragile and impossible to maintain codebase because it was impossible for any outsider to tell what was going on.
2. The "just pass the damn tests" guy had a team that had quickly ramped on a new language and they had a codebase that was incomplete (because they were building it like a Lego project) but that everyone could understand because the code blocks generally stood on their own.
The next step: shut down the guy who abstracted everything and force him to drive a quick & dirty rewrite that would be more maintainable, and also start a major refactoring of the "Lego" team's code, because it was so fragmented that it too was fragile and unsuited for production.
I saw this as a terrific learning experience for all involved and I was able to get away with it because the stakes were pretty low and we had time to experiment (part of the ultimate objective was upskilling the team), but the more important lessons were these:
1. Docs matter. Take the time to write clear & detailed specs first because you'll be forced to think of edge cases and functionality that you didn't originally, and it provides a basis for system design, too.
2. Architecture & design matter. Adhering too close to any single paradigm is probably a mistake, but it takes experience on the team to understand where the compromises are and make the best decision for that system.
That second point will not stop being true with the advent of agentic assisted software development. Like others have said, my expectation in the job market is that pay will continue to be depressed for junior hires as employers reset expectations and generally just want folks who can instruct fleets of agents to do the actual coding. Senior staff will become increasingly critical and their jobs will be painful and difficult, because it'll be assumed they can (and will be willing to) do design & code reviews of artifacts originated by agents.
What I am going to be most interested in is what happens in the SRE/Sysadmin world over the next few years as more AI-generated code hits prod in organizations that don't have adequate review & oversight functions.
You kind of answered the question yourself. Humans write the tests and then go tell the AI to write the solution which passes the test.
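A minimal sketch of that "just write code that passes the test" workflow. The function and test names here are hypothetical, just to show the shape of the division of labor:

```python
# --- Spec, written first by the design owner ---------------------------
def test_no_discount_under_threshold():
    assert apply_discount(total=50.0) == 50.0

def test_ten_percent_at_threshold():
    assert apply_discount(total=100.0) == 90.0

# --- Implementation, written (by a human or an agent) only to make -----
# --- the spec pass; the implementer needn't grasp the wider design -----
def apply_discount(total: float) -> float:
    """Apply a 10% discount on orders of 100 or more."""
    return total * 0.9 if total >= 100 else total

# Run the spec
test_no_discount_under_threshold()
test_ten_percent_at_threshold()
print("spec satisfied")
```

The trade-off the anecdote above illustrates: each function passes its tests and stands on its own, but nothing in the spec forces the pieces to compose into a coherent whole.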
Maybe you're just a really really good engineer and product thinking hybrid!
You learn a ton of valuable lessons going from 0 to v1. And a ton of value is created. I guess I'm unclear how you're defining "actually valuable" here.
This is evident in my personal experience by the fact that I am often the one that sees scaling and maintenance issues long before they happen. But of course parent would claim this is impossible.
Edit: a legacy vibe coder
If v1 is successful and attracts a lot of users, it will have to have features added and maintained.
Doing that in ways that does not produce "legacy code" that will have to be thrown away and rewritten in a few years is a very different skill than getting v1 up and running, and can easily be what decides if you have a successful business or not.
The original punchline ("you don't learn the actually valuable lessons.") was just a bit too sharp, so you even edited in a pseudo-clarification which actually just repeats that punchline but in a softer way, masterful!
Almost invariably after submitting, I see how I could clarify and/or expand on my thoughts, so I often do end up editing.
In my experience separating the roles out is silly if you're an engineer yourself. We do this a lot and that leads to silly mentalities. Greenfield developer vs maintenance engineer, MVP engineer vs Big Tech dev, FOSS hacker vs FOSS maintainer. Each of those dichotomies speaks to cultural differences that we humans amplify for no reason.
In truth the profession needs both and an engineer that can do both is the most effective. The sharpest engineers I've worked with over the years can hack new, greenfield stuff and then go on to maintaining huge projects. Hell Linus Torvalds started out by creating Linux from scratch and now he's a steward of the kernel rather than an author!
One of the tricks of HN is the 'delay' setting. https://news.ycombinator.com/item?id=231024
> There's a new field in your profile called delay. It's the time delay in minutes between when you create a comment and when it becomes visible to other people. I added this so that when there are rss feeds for comments, users can, if they want, have some time to edit them before they go out in the feed. Many users edit comments after posting them, so it would be bad if the first draft always got shipped.
I've got mine set to 2. It gives me a little bit of time for the "oh no, I need to fix things" or "I didn't mean to say that" moments before everyone else can see it.
AI makes it look like these developers can do the same job the Americans did building the product to begin with. Even if things fall apart in the end, it won't stop the attempt to reduce maintenance costs by an order of magnitude.
Staff or principals that have a tenure of majority greenfield development are extremely dangerous to companies IMO. Especially if they get hired in a nontraditional tech company, like utilities, banking, or insurance.
And if your entire career is nothing but maintenance and sustaining projects, you'll never know what decisions it takes to build a greenfield application that lives long enough to become a graybeard.
You'll think you do because you see all the mistakes they made, but you'll only have cynical explanations for why those mistakes get made, like "they don't care, they just make a mess and move on to the next job" or "they don't bother learning the tools/craft deeply enough; it's all speed for them".
-
To indulge myself in the niceness a bit: I don't think you write comments like the one above if you've done both, yet having done both feels like an obvious requirement to be a well-rounded Staff/Principal.
Most maintenance work suffers because of decisions made at the 0 to 1 stage. And most greenfield work fails entirely, never maturing to the maintenance stage.
So both sides have to do something right in the face of challenges unique to their side. And having familiarity with both is extremely valuable for technical leadership.
When working at larger orgs on legacy projects (which I have also done) you think "what sort of idiot did this?"
Then when you're the one tasked with getting a project shipped in two weeks that most reasonable engineers would argue needs two months, you have to start making strategic decisions at 2am about which maintainability issues will block the growth of the product on the way to funding, and which ones can be fixed before 5pm by someone that will think you're an idiot in 3 years.
How times have changed
But to reword it: if you think the reason 0 to 1 work is typically a duct-taped mess is because of a lack of experience or understanding from greenfield devs, you'll probably fail at 0 to 1 work yourself.
Not that a noob developer great at selling has never landed 0 to 1 work, crapped out a half-working mess, and left with a newly padded resume... but maintenance-only work misses out on by far the most volatile and unpredictable stage of a software project, with its own hard lessons.
The duct-taped nature of 0 to 1 work is usually a result of the intersection of fickle humans and software engineering, not a lack of knowledge.
-
People in maintenance can do things like write new tests against the production system to ensure behavior stays the same... what happens when 1 quarter into a 2 quarter project it turns out some "stakeholder" wasn't informed and wants to make changes to the core business logic that break half the invariants you designed the system around. And then after that it turns out you can't do that, legal pushed back. And then a few weeks later they came to an agreement so now we want a bit of A and B?
Or you're in consumer and there's a new "must have" feature for the space? Maybe you'd like to dismiss as "trend chasing", but that'll just doom your project in the market because it turns out following trends is a requirement for people to look at everything else you've built
Or worst of all, you know that quality engineering of the system will take 8 weeks, and there's a hard deadline on someone else's budget of 4 weeks, and you can of course decline to ship it, but then you'll need a new job. (and I know, you'll say "Gladly, I take pride in my engineering!", but again, you're probably going to end up maintaining a project that only survived by doing exactly what you quit over)
tl;dr it's Yin and Yang: you can't have one without the other, and you need to have a bit of the other side in you whenever you're working in the capacity of either to be a good technical leader.
You'll figure out what you should have built after it's been used in prod for a while. Possibly years.
How many offers did you receive? Companies have also adopted your strategy: interviewing candidates "to see what's out there" - there's a job I interviewed for that's still open after 10 months.
When I was doing a lot of hiring we wouldn't take the job posting down until we were done hiring people with that title.
It made a couple people furious because they assumed we were going to take the job posting down when we hired someone and then re-post a new listing for the next person.
One guy was even stalking LinkedIn to try to identify who was hired, without realizing that many engineers don't update their LinkedIn. Got some angry e-mails. There are some scary applicants out there.
Sometimes a specific job opening needs to stay open for a long time to hire the right person, though. I can recall some specific job listings we had open for years because none of the people we interviewed really had the specific experience we needed (though many falsely claimed it in their applications, right until we began asking questions).
If you need to wait YEARS to hire someone with some specific experience, I can guarantee that you really didn't need that person. You're doing this just to check some specific artificial goal that has little to do with the business.
There's a difference between "critically needing" and "would benefit from."
If you can find the specialist who's done what you're doing before at higher scale and can help you avoid a lot of pain, it's awesome. If not, you keep on keeping on. But as long as you don't spend too much on the search for that candidate, it's best to keep the door open.
There is no requirement that every job opening needs to be urgently filled.
You keep repeating this like it means the job opening shouldn't exist at all. Not all job openings are for urgent demands that must be filled right away or not exist at all.
Option 1) Hire someone sub-standard and deal with either an intense drag on the team while they come up to speed or, worst case, having to manage them out if they couldn't cut it.
Option 2) Give up the requisition which looked like an admission that we didn't really "need" the position, and also fails to help with senior management and director promotions tied to org size.
This always seemed pathological to me and I would have loved to have the ability to build a team more slowly and intentionally. Don't let all this criticism get to you.
Imagine working on Voyager II... or some old-ass banking software that still runs RPG (look it up, I'll wait), or trying to hire someone to do numerical analysis for the genesis of a format that supersedes IEEE float... or whatever.
There are many applications for extremely specific skillsets out there. Suggesting otherwise is, in my opinion, clearly unwise.
I've worked in specialized fields where it takes YEARS for the right candidate to even start looking for jobs. You need to have the job listings up and ready.
This was extremely true when we were working on things that could not be done remote (literal physical devices that had to be worked on with special equipment in office).
Engineers aren't interchangeable cogs.
> I can guarantee that you really didn't need that person.
So what? There are many roles where we don't "need" someone, but if the right person is out there looking for a job we want to be ready to hire them.
Engineers aren't cogs, but they are able to travel, and you can hire them by means other than full-time employment. I suspect that's probably what you should have done in your situation.
Nothing about this was mission critical, or even all that important, or you would have found a way to solve the problem (or you did, and it wasn't a problem to begin with). I'm in a field where people often want to hire me for some special thing like this, but it often turns out most of my life would be spent idle, because no one company has enough demand for me. I can consult instead and be busy all year, or I can take a job for someone who's OK with me being idle 80% of the time. I prefer the former for multiple reasons, but it's an example of why hiring for specialized roles that aren't mission critical is often not the thing you should be doing.
I don't know why you assumed that. We had teams. We just wanted to grow them.
We weren't sitting there waiting.
I don't know where you're getting these ideas. We weren't hiring people to repair a backlog of devices. Warranty and repair work typically goes to the contract manufacturer, for what it's worth.
Companies like to grow and develop more products. You need more people.
If this is true, then those shouldn't even be public job postings. That sort of critical position is for headhunters.
Why? Not everyone is on LinkedIn or has an updated profile.
Some of the best candidates I've hired were people who were in other states who were planning to move, but waiting for the right job opportunity to come up.
We also used recruiters.
Why does it make people so angry that we posted job listings for real jobs that we were really hiring for?
If only we had listened to HN comments and given up instead
I recommend the article "Up or Out: Solving the IT Turnover Crisis" [0] which gives a reasonable argument for doing exactly that.
Notes:
0 - https://thedailywtf.com/articles/up-or-out-solving-the-it-tu...
There's a lot of anger in this thread at companies for making obvious choices.
If the perfect applicant happens to be looking for a job and it can save us the time and churn of switching someone internally, then yes: I would prefer to hire that person.
> The whole hiring angle you describe seems silly in terms of process and expectations
I think the silly part of this thread is all the comments from people who think they know better than the people who were in it how to operate a company they know nothing about.
I think we could all be a little more mindful of that in hiring. That waiting for perfection is itself a fallacy for all these reasons and plenty more.
Elsecomment and on Reddit, you'll see the attitude that a candidate's years of experience should be sufficient assurance for their prospective employer that they can pick up whatever other technologies are out there.
This is often coupled with the "you shouldn't need to learn new things outside of your 9-5."
Here, you are presenting a situation where a company would rather promote from within (countering job-hopping culture) and would, in the hiring process, penalize someone who is not learning about things beyond what their current employer uses.
---
And you've mentioned it elsecomment too - it's about the risk. A company hiring an individual who isn't familiar with the technology and has not shown the ability to learn new material is more risky a hire than one who is either familiar with it professionally or has demonstrated the ability to learn new technologies.
That runs counter to the idea of the "best" candidate being the one who is most skilled; rather, the "best" candidate is the one who is the least risky hire.
I screen hundreds of resumes a week when hiring. I know this very well.
Hiring the wrong person can easily be a net negative to the team. Hiring too fast and desperately hiring anyone who applies is doubly bad because it occupies limited headcount and prevents you from hiring the right person when they become available.
Building teams is a long game.
So if you don't have a job opening posted on the day they're sending out applications, you may miss your shot to hire them.
“We’re making do, but we’re kind of figuring out X as we go. That’s working for now, but the problems keep getting knottier as we grow and change—it works, but it’s expensive in terms of avoidable mistakes.
Nothing’s on fire, but if we ever got the chance, we’d value authentic expertise in this niche. But if it’s just ‘I could probably figure that out,’ we’ve already got plenty of that internally.”
Where a good hire ends up helping those internal people as they develop experience and expertise, and one that’s not right is worse than none at all.
That still takes a long time if random Senior Engineer X who's looking on LinkedIn is only 10% of the way there for what you'd need for a very specialized role.
It's a small engineering org, allegedly head-hunting one principal engineer for the whole org, so it's a single opening. 10 months later they are still hunting for their special snowflake.
> I can recall some specific job listings we had open for years because none of the people we interviewed really had the specific experience we needed
This is exactly what I mean. If you can go for years without filling a role, it's non-essential, and you are, in effect, "seeing what's out there". More and more companies are getting very picky on mundane roles, such as insisting on past experience in specific industries: "Oh, your extensive experience in low-latency comms is in telecoms? We prefer someone who's worked in TV broadcast, using these niche standards specifically, even though your knowledge is directly transferable. We don't want to waste 5 days on training."
For example, your company might need a full-time network admin once its network grows to a certain size and complexity. You won’t hit that level for three years but you’d hire the perfect person now if you found them even though they might be spending a lot of idle time scrolling Hacker News for the first year or two. At 5x the growth rate, you’d need that person within less than a year, and you might be less picky about whether they are coming from a TV or telecom shop.
More specialized.
If we wanted to train someone, we'd start with an internal candidate who was familiar with the other parts of the job and then train them on this one thing.
Hiring an outsider who doesn't know the subject matter and then teaching them is less efficient and more risky. It was better to have someone in the team learn the new subject as an incremental step and then backfill the simpler work they were doing.
If your hiring model is hiring multiple people through one posting, you will probably get a lot fewer angry ex-candidates being weird (thinking you've lied to them because the posting is still up) by sending out rejections that don't mention the posting and just get the "we're no longer interested in you for this role" message across.
Nicer/more corporate language for both, of course.
No, this isn't possible unless you delay rejection letters until you hire someone.
We send letters as soon as the decision is made not to continue with that candidate.
Honestly it would be cruel to string them along any longer.
I've been running the same job ad for 2 years now, as a recruiter for a big Canadian bank. I've been laughed at for having ridiculously unrealistic standards. I've been accused of running ghost ads.
I'm in the process of hiring the 13th person using this same job ad for new and existing teams that need a very particular type of engineer.
On the hiring side, at least in tech: interviewing really sucks. It's a big time investment from multiple people (HR, technical interviewers, managers, etc).
I'm not saying it's impossible that companies are interviewing for fun, but it seems really unlikely to me anyone would want to do interviews without seriously intending to hire someone.
I know it sucks; I've sat on the other side of the interviewing desk many times, and the charade wastes everyone's time - the candidates' most of all, because no one values that.
> I'm not saying it's impossible that companies are interviewing for fun, but it seems really unlikely to me anyone would want to do interviews without seriously intending to hire someone.
It sounds like you've never had to deal with the BS that is headcount politics, which happens more at larger organizations due to more onerous processes. Upper management (director, VP) can play all sorts of games to protect a headcount buffer[1], and everyone down the chain has to waste their time pretending to be hiring just because the division heads want to "maximize strategic flexibility" or however they phrase it.
1. Which is reasonable, IMO. Large companies are not nimble when reacting to hiring needs. The core challenge is the conflicting goals thrust on senior leadership reporting to the C-Suite: avoiding labor shocks and maximizing profitability -- the former requires redundancy, but the latter, leanness.
I am on the interviewing and screening side and understand what you're saying. I also empathize with the people I routinely reject who don't understand why they were rejected. It's hard to see why you might not be a right fit for a role.
> it seems really unlikely to me anyone would want to do interviews without seriously intending to hire someone.
I keep seeing this accusation thrown around and like you, I have a hard time seeing this. On the flip side, looking at it from the eyes of many disenchanted candidates, I can see how a theory like this is appealing and self-reinforcing.
Most prefer a greenfield project.
https://news.ycombinator.com/item?id=33394287
Large companies tend to over specialize and that’s where I see the “I’m a builder” types fall apart. That takes away agency, lowers skills, and removes variety from work. That’s when it stops being fun to me.
I would hope most people with the builder archetype are otherwise fine to keep building and maintaining.
The Pick-Up Artist's Guide to Tech Interviewing, you should be writing.
The first 100 subscribers get a 50% off discount the month of March, you should be announcing on LinkedIn and Tiktok, and making passive income.
The rest of us experienced people with proven track records have to learn algorithms on the weekends despite having white hair.
A few years ago, when interest rates were 0% and companies were hiring at an unsustainable rate, I got a lot of criticism for cautioning engineers against non-coding roles. I talked to a lot of people who dreamed of moving into pure architect roles where they advised teams of more junior engineers about what to build, but didn't get involved with building or operating anything.
I haven't kept up with everyone but a lot of the people I know who went that route are struggling now. The work is good until a company decides to operate with leaner teams and keeps the people committing code. The real difficulties start when they have to interview at other companies after not writing much code for 3 years. I'm in a big Slack for career development where it's common for "Architect" and "Principal Engineer" titled people to be venting about how they can't get past the first round of interviews (before coding challenges!) because they're trying to sell themselves as architects without realizing that companies want hands-on builders now.
I'm no AI booster, but I think this is exactly the scenario where AI-driven development is going to let those non-coding developers shine. They still remember how code works (and have probably still done PR review from time to time), so they're well placed to write planning documents for an AI agent and verify its output.
I left to a startup where I write code and design architecture. I even had a former coworker tell me "wow you're willing to do stuff like that at this point in your career?"
Did you get any offers yet? It seems the issue is not a lack of interviews but a lack of offers. Many companies are looking for a Goldilocks candidate and are happy to pass on anyone who doesn't match their ideal.
Semi related, holy hell do companies have a lot of interview rounds these days. It seems pretty standard to spread 5-6 Teams calls over the course of a month. I get that these are high salary, high impact roles and you want to get it right. But this feels really excessive. And I'm not talking about FAANG tech giants here. It's everyone, from startups to random midsize insurance companies.
Most resumes are not very good. Beyond the obvious problems like typos, there is a lot of bad advice on the internet that turns resumes into useless noise. Screen a lot of resumes and you'll get tired of seeing "Boosted revenue by 23% by decreasing deploy times by 64%." This communicates nothing useful and we all know that revenue going up 23% YoY was not attributable to this single programmer doing anything at all.
Often I'll get candidates into interviews and they light up telling me about impressive things they did at a past job with enough detail to convince me they know the subject, but their resumes are garbage because they've followed too many influencers.
So try to work on your resume first. Try different resumes. Rewrite it and see what makes interviewers take notice and what they ignore. The most common mistake is to write a resume once and then spam it to 100 jobs. I know it's not fun to change the resume or invest time into applying for a job that may not respond, but you know what else isn't fun? Applying to 100 jobs and not getting any responses because every hiring manager has 20 tailored resumes in their inbox ahead of yours.
Having a simple but clear LinkedIn profile helps. Many scoff at this, but it works. You don't have to read LinkedIn's social media feed or do anything with the site. Just set it up and leave it for them to find.
GitHub portfolios and other things have low relative value at most companies. There are some exceptions where someone will look at it and it might tip the balance in your favor, but it's a small optimization. You need to perfect the resume first, get a LinkedIn that looks decent second, and only then think about the ancillary things.
I'm putting more time into cleaning up my LinkedIn profile since that's been my most reliable route into hiring pipelines (other than referrals and networking).
This is the "quantify everything" mantra career coaches have been repeating for decades. As the story goes, no company is going to care that you refactored the FooBar library in order to make bugs in the DingDang module easier to fix. You have to only write down things that moved some quantifiable business needle like revenue or signups, even if the link to that needle is tenuous. Obviously, this ends up penalizing hard working, talented devs who don't happen to be working in areas where wins are easily quantifiable.
It's the useless quantification that turns resumes into noise, combined with making claims that you changed revenue by yourself.
> You have to only write down things that moved some quantifiable business needle like revenue or signups, even if the link to that needle is tenuous. Obviously, this ends up penalizing hard working, talented devs who don't happen to be working in areas where wins are easily quantifiable.
Every hiring manager knows this game and sees right through it. You can't read 1,000 resumes claiming "Increased revenue by 13% by" followed by something that clearly was not the reason revenue increased 13% without becoming numb to it.
Nobody believes these.
The somewhat useful quantifications are things like "Reduced cloud spend by 50% by implementing caching". This can spark a conversation about how they diagnosed the issue, made a transition plan, ensured it was working, and all the other things we want to hear about.
This is a person who you're going to be reviewing their code or reading the documentation that they write.
If there are typos and poor formatting in the resume (which they've had the leisure of reviewing and correcting themselves), what does that say about the quality of the code or documentation they're going to produce under a time constraint?
Are you going to be faced with the decision of having code with variables that have spelling errors and documentation that is grammatically or factually incorrect go through because of the time pressure?
The resume itself is a demonstration of the applicant's ability to pay attention to the details that matter in software development without showing a single line of NDAed code.
Everyone has seemingly adopted the FAANG playbook for interviewing that doesn’t really select for people who like getting their hands dirty and building. These kinds of interviews are compliance interviews: they’re for people who will put in the work to just pass the test.
There are so many interviews I’ve been in where if I don’t write the perfect solution on the first try, I’ll get failed on the interview. More than ever, I’m seeing interviewers interrupt me during systems or coding interviews before I have a chance to dig in. I’ve always seen a little bit of this, but it seems like the bar is tightening, not on skill, but on your ability to regurgitate the exact solution the interviewer has in mind.
In the past I’ve always cold applied places and only occasionally leaned on relationships. Now I’m only doing the latter. Interviewees are asked to risk asymmetrically compared to employers.
You've been interviewing forever. You're the well practiced pickup artist of job searching. Of course you'll be getting the call backs over the other 1000 applicants who don't have the same experience level applying. You "just know" how to read between the lines and tailor a resume, whip up a cover letter, etc whereas they're making mistakes.
The majority of engineers, in my hiring experience, failed very simple tests pre-AI. In a world where anyone can code, they're no better than previously non-technical people. The CS degree is no longer protection.
The gap between average and the best engineers now, though, is even higher. The best engineers can visualize the whole architecture in their head, and describe exactly what they want to an AI - their productivity is multiplied, and they rarely get slowed down.
While this could be done by junior or senior, I think junior usually has the slight advantage in being more AI-native and knowing how to effectively prompt and work with AI, though not always.
AI has fundamentally broken the education system in a way that will take decades for it to fully recover. Even if we figure out how to operate with AI properly in an educational setting in such a way that learners actually still learn, the damage from years of unqualified people earning degrees and then entering academia is going to reverberate through the next 50 years as those folks go on to teach...
That time when you got to internalise through first hand experience what good & bad look like is when you built the skill/intuition that now differentiates competent LLM wielding devs from the vibers. The problem is that expectations of juniors are inevitably rising, and they don't have the experience or confidence (or motivation) to push back on the 'why don't you just AI' management narrative, so are by default turning to rolling the dice to meet those expectations. This is how we end up with a generation of devs that truly don't understand the technology they're deploying and imho this is the boringdystopia / skynet future that we all need to defend against.
I know it's probably been said a million times, but this kinda feels like global warming, in that it's a problem that we fundamentally will never be able to fix if we just continue to chase short term profit & infinite growth.
I would say that baptism by fire _is_ where the quality of an academic education comes from, historically at least. They are the same picture.
In my experience, target schools are the only universities now that can make their assignments too hard for AI.
When my university tried that, the assignments were too hard for students. So they gave up.
Education and training and entry level work build judgement.
AI is either the next wheel or abysmal doom for future generations. I see both and neither at the same time.
In a corporate environment where navigating processes, politics, and other non-dev tasks takes significantly longer than actual coding, AI is just a slightly better Google search. And trust me, all these non-dev parts are still growing, and growing fast. It's useful, but it's not elevating people beyond their true levels in any significant way (I think we can agree that, e.g., number of lines produced per day ain't a good metric; more like a Dilbert-esque comic for Friday afternoon).
I think this must be part of it. I see so many posts about people burning a thousand dollars in AI credits building a small app, and I have no idea why. I use the $20 Claude plan and I rarely run out of usage, and I make all kinds of things. I just describe what I want, do a few back-and-forths of writing out the architecture, and Claude does it.
I think the folks burning thousands of dollars of credits are unable to describe what they want.
But juniors don't (usually) have the knowledge to assess whether what the AI has produced is OK or not. I agree that anybody (junior or senior) can produce something with AI; the key question is whether the same person has the skills to assess (e.g., to ask the right questions) that the produced output is what's needed. In my experience, junior + AI is just a waste of money (tokens) and a nightmare to take accountability for.
I perceive the AI itself as a very fast junior that I pair program with. So you basically need the seniority to be able to work with a "junior ai".
The bar for human juniors is now way higher than it used to be.
I very much follow the pattern of having the whole architecture in my head and describe it to the AI which generates the appropriate code. So now the bottlenecks are all process related: availability of people to review my PRs, security sign offs on new development, waiting on CI builds and deployments, stakeholder validation, etc. etc.
Did you consider that whiteboard/leetcode interviews are unnaturally stressful environments? Have you gone through a mid-level or difficult technical appraisal yourself lately? Try it out just to get an idea of how it feels on the other side...
I always asked a simple question like here is an array full of objects. Please filter out any objects where the "age" property is less than 20, or the "eye color" property is red or blue. It was meant more as a sanity check that this person can do basic programming than anything else.
Tons and tons of people failed to make basically any progress, much less solve the problem, despite saying that they programmed day to day in that language. For a mid-level role I would filter out a good 8 or 9 out of ten applicants with it.
I would consider it a non-leetcode type of question since it did not require any algorithm tricks or any optimization in time/space.
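For reference, the sanity check described above can be solved in a few lines. This is a sketch, not the interviewer's actual answer key, and the function name and object shape are illustrative assumptions:

```javascript
// Filter out any objects where the "age" property is less than 20,
// or the "eyeColor" property is "red" or "blue".
function screenCandidates(people) {
  return people.filter(
    (p) => !(p.age < 20 || p.eyeColor === "red" || p.eyeColor === "blue")
  );
}

const sample = [
  { age: 25, eyeColor: "green" }, // kept
  { age: 18, eyeColor: "green" }, // removed: under 20
  { age: 40, eyeColor: "red" },   // removed: red eyes
];
console.log(screenCandidates(sample)); // only the first object survives
```

A single `filter` with a negated predicate is all it takes, which is exactly why failing it is such a strong signal.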
Nowadays that kind of question is trivial for AI, so it doesn't seem like the best test. I'm not hiring right now, but when I do, I'm not sure what I will ask.
You're assuming the question has to even be that difficult. I've proctored sessions for senior-level webdev roles where the questions were akin to "baby's first React component" -- write a component that updates a counter when you click a button. So many candidates (who purported to be working with React for years) would fail, abysmally. Not like they were just making small mistakes; I didn't even care about best practices -- they just needed to make it work. So many failed. Lot of frauds out there.
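For a sense of how little was being asked: the state logic behind that counter question fits in a closure. This framework-free sketch is mine, not the interviewer's rubric; the React version just wires `click` to a button's `onClick` and renders `value`:

```javascript
// Minimal counter state: increment on click, expose the current count.
function makeCounter() {
  let count = 0; // in React this would be useState(0)
  return {
    click: () => { count += 1; }, // the onClick handler
    value: () => count,           // what the component would render
  };
}

const counter = makeCounter();
counter.click();
counter.click();
console.log(counter.value()); // → 2
```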
There are so many software engineering candidates who literally cannot write the simplest code. I even had someone actually say "I don't really write code at my current job, I'm more of a thought leader." Bzzzzzt.
I've always prepared what I called level 1, level 2, and level 3 questions ready for candidates. But, I almost never even got to level 2, and never in 20 years of interviewing got to my level 3 questions.
I've been around the block for over 3 decades. I've had a number of high level positions across both IC and management tracks. These days I'm very hands on keyboard across a number of clients. If you asked me to write a basic for loop or if statement, there's a small chance I'd flub the exact syntax if writing on a whiteboard. Both because I bounce between languages all day and wires get crossed on the fly, but also the standard interview pressure type arguments. Whereas if the test is "does this person understand what a for loop is and how it works?", then yes, I can easily demonstrate I do.
In real life I'm not going to take an interview where there's not already that degree of trust so if that questions comes up something is already wrong. But I'm sure there are interviewers in the world who'd fail someone for that.
One of the worst guys took 20 minutes, with me having to coach him through it the entire time. It was a true exercise in patience, but I don't mind helping people learn new things. When he got his rejection email, he actually complained to the recruiter because he thought he did really well. Dude...
Half of the people I screen fail it. It's crazy.
https://blog.codinghorror.com/why-cant-programmers-program/
Most interviewees failed fizzbuzz, and that was 20 years ago.
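For anyone who hasn't seen it, FizzBuzz is the canonical minimal screen: for 1..n, print "Fizz" for multiples of 3, "Buzz" for multiples of 5, "FizzBuzz" for both, and the number otherwise. One plausible solution:

```javascript
// Classic FizzBuzz: check divisibility by 15 first so the combined
// case isn't shadowed by the 3 and 5 cases.
function fizzbuzz(n) {
  const out = [];
  for (let i = 1; i <= n; i++) {
    if (i % 15 === 0) out.push("FizzBuzz");
    else if (i % 3 === 0) out.push("Fizz");
    else if (i % 5 === 0) out.push("Buzz");
    else out.push(String(i));
  }
  return out;
}

console.log(fizzbuzz(15).join(" "));
// → 1 2 Fizz 4 Buzz Fizz 7 8 Fizz Buzz 11 Fizz 13 14 FizzBuzz
```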
It’s been well over a decade that I’ve had to do the coding interview monkey dance and I actually turned down an offer where I did pass a coding interview because I found it insulting and took a job for slightly less money where the new to the company director was interested in a more strategic hire (2016). That was the same thing that happened before in 2014 and after in 2018 - a new manager/director/CTO looking for a strategic hire.
In fact even my job at BigTech -AWS ProServe (full time blue badge RSU earning employee) as a customer facing consultant specializing in app dev was all behavioral as well as my next full time job as a staff consultant in 2023.
I’m 51 years old and was 40 in 2014. If I’m still trying to compete based on my ability to reverse a b tree on the whiteboard even at 40, I have made some horrible life decisions.
(Well actually I did make a horrible life decision staying at my second job too long until 2008 and becoming an expert beginner. But that’s another story)
I can never get over how this became a thing. Was listening to a Brian Cox video on YouTube the other night (something about his voice helps me sleep). He said "I don't memorize formulas, it's easy to look them up."
If you ever need to reverse a b tree (in 30+ years of writing code, I never have) it's easy to look that up. It tells me nothing about your ability as a developer of real software that you spent time memorizing trivia before an interview.
It's a contrived scenario, but the whole point is that it measures min(a,b) where `a` is your ability to think, and `b` is your ability to prepare (and memorize answers ahead of time). (I'd personally try to find ways to measure `a` instead of `b`, maybe by asking questions people wouldn't have heard before.)
So much of tech hiring cargo culting has been built up around leetcode and other coding problems, puzzles, and more. We all pay lip service to systems thinking and architecture, but I question if even those are testing the correct things for the modern era.
And then what happens in a year when the models can handle that as well?
Let them use their preferred setup and AI to the full extent they want, and evaluate their output and their methodology. Ask questions of "why did you choose X over Y", especially if you're skeptical, and see their reasoning. Ask what they'd do next with more time.
It's clear when a candidate can build an entire working product, end-to-end, in <1 day vs. someone who struggles to create a bug-free MVP and would take a week for the product.
In addition to the technical interview, hiring them on a trial basis is the absolute best if possible.
Taste and technical understanding of goals and implementation to reach those goals is the biggest differentiator now. AI can handle all the code and syntax, but it's not great at architecture yet - it defaults to what's mid if not otherwise instructed.
I do feel like there's something *different* about the required skillset now, and it's not something that all engineers have, even experienced ones. But I can't put my finger on what exactly it is. If I'm right though, classic interview techniques won't select for it because they were never intended to do so.
Either the machines exterminate us or we become glorified pets.
Hope the AIs prefer us to cats (even though that's a long shot).
An amateur with a chess engine that blunders 10% of the time will hardly play much better than if they didn't use it. They might even play worse. Over the course of a game, those small probabilities stack up to make a blunder a certainty, and the amateur will not be able to distinguish it from a good move.
However, an experienced player with the same broken engine will easily beat even a grandmaster since they will be able to recognise the blunder and ignore it.
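The "small probabilities stack up" point can be made concrete. A quick sketch (numbers chosen for illustration: an independent 10% blunder rate per suggestion over a 40-move game) shows a blunder becomes a near-certainty:

```python
# Sketch: how a per-move blunder probability compounds over a game.
# The 10% rate and 40-move game length are illustrative assumptions.
def p_at_least_one_blunder(p_per_move: float, moves: int) -> float:
    """Probability that at least one engine suggestion is a blunder,
    assuming each move's blunder chance is independent."""
    return 1 - (1 - p_per_move) ** moves

# An engine that blunders 10% of the time, over a 40-move game:
print(round(p_at_least_one_blunder(0.10, 40), 3))  # -> 0.985
```

The amateur eats that near-certain blunder; the strong player filters it out.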
I often find myself asking LLMs "but if you do X won't it be broken because Y?". If you can't see the blunders and use LLMs as slot machines then you're going to spend more money in order to iterate slower.
I guess? I don't really see why that would be the case. Being a senior is also about understanding the requirements better and knowing how/what to test. I mean we're talking about prompting text into a textarea, something I think even an "old timer" can do pretty well.
I'm not sure why junior engineers would be any better at that though, unless it's just that they're approaching it with less bias and benefiting from beginner's luck.
Before gen AI, I used to give candidates at my company a quick one-hour remote screening test with a couple of random "FizzBuzz"-style questions. I would usually paraphrase the question so a simple Google search would not immediately surface the answer, and 80% of candidates failed at coding a working solution, which was very much in line with the article. Post gen AI, that test effectively dropped to a 0% failure rate, so we changed our selection process.
[1] https://blog.codinghorror.com/why-cant-programmers-program/
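For reference, the classic form of that screening question (the poster paraphrased theirs to defeat a Google search, so this exact wording is illustrative) looks like:

```python
# The canonical FizzBuzz screening question: for each number, print
# "Fizz" if divisible by 3, "Buzz" if divisible by 5, "FizzBuzz" if
# divisible by both, otherwise the number itself.
def fizzbuzz(n: int) -> str:
    if n % 15 == 0:
        return "FizzBuzz"
    if n % 3 == 0:
        return "Fizz"
    if n % 5 == 0:
        return "Buzz"
    return str(n)

print(", ".join(fizzbuzz(i) for i in range(1, 16)))
```

The point of the original article was that a surprising share of applicants couldn't produce even this much working code under interview conditions.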
I'd go a step further and say the engineers who, unprompted, discover requirements and discuss their own designs with others have an even better time. You need to effectively communicate your thoughts to coding agents, but perhaps more crucially you need to fit your ever-growing backlog of responsibilities into the larger picture. Being that bridge requires a great level of confidence and clear-headedness and will be increasingly valued.
I should have a credential I have to maintain every few years, one or two interviews, and that should get me a job.
We have a lot of people where if you gave them clear requirements, they could knock out features and they were useful for that, but I have an army of agents that can do that now for pennies. We don't need that any more. We need people who have product vision and systems design and software engineering skills. I literally don't even care if they can code with any competency.
Btw, if you think that copying and pasting a jira ticket into claude is a skill that people are going to pay you for, that is also wrong. You need to not just be able to use AI to code, you need to be able to do it _at scale_. You need to be able to manage and orchestrate fleets of ai agents writing code.
I graduated 9 months ago. In that time I've merged more PRs than anyone else, reduced mean time to merge by 20% on a project with 300 developers with an automated code review tool, and in the past week vibe coded an entire Kubernetes cluster that can remotely execute our builds (working on making it more reliable before putting it into prod).
None of this matters.
The companies/teams like OpenAI or Google Deepmind that are allegedly hiring these super juniors at huge salaries only do so from target schools like Waterloo or MIT. If you don't work at a top company your compensation package is the same as ever. I am not getting promoted faster, my bonus went from 9% to 14% and I got a few thousand in spot bonuses.
From my perspective, this field is turning into finance or law, where the risk of a bad hire due to the heightened skill floor is so high that if you DIDN'T go to a target school you're not getting a top job no matter how good you are. Like how Yale goes to Big Law at $250k while non T14 gets $90k doing insurance defence and there's no movement between the categories. 20-30% of my classmates are still unemployed.
We cannot get around this by interviewing well because anyone can cheat on interviews with AI, so they don't even give interviews or coding assessments to my school. We cannot get around this with better projects because anyone can release a vibe coded library.
It appears the only thing that matters is pedigree of education because 4 years of in person exams from a top school aren't easy to fake.
People are posting about pull requests, use of AIs, yada yada. But they never tell us what they are trying to produce. Surely this should be the first thing in the post:
- I am developing an X
- I use an LLM to write some of the code for it ... etc.
- I have these ... testing problems
- I have these problems with the VCS/build system ...
Otherwise it is all generalised, well "stuff". And maybe, dare I say it, slop.
edit: to clarify, I'm using recc which wraps the compiler commands like distcc or ccache. It doesn't require developers to give up their workspace.
Right now I'm using buildbarn. Originally, I used sccache but there's a hard cap on parallel jobs.
In terms of how LLMs help, they got me through all the gruntwork of writing jsonnet and dockerfiles. I have barely touched that syntax before so having AI churn it out was helpful to driving towards the proof of concept. Otherwise I'd be looking up "how do I copy a file into my Docker container".
AI also meant I didn't have to spend a lot of time evaluating competing solutions. I got sccache working in a day and when it didn't scale I threw away all that work and started over.
In terms of where the LLM fell short, it constantly lies to me. For example, it mounted the host filesystem into the docker image so it could get access to the toolchains instead of making the docker images self-contained like it said it would.
It also kept trying not to do the work, e.g. it randomly decides in the thinking tokens "let's fall back to a local caching solution since the distributed option didn't work", then spams me with checkmark emojis and claims in the chat message that the distributed solution is complete.
A decent amount of it is slop, to be honest, but an 80% working solution means I am getting more money and resources to turn this into a real initiative. At which point I'll rewrite the code again but I'll pay closer attention now that I know docker better.
Are you sure they don't have to fix the build pipeline first? Tens of thousands of vCPUs for a single compilation run, or to accommodate 100 developers who try to compile their own changes?
Sorry, I wasn't clear. I am not virtualizing the workspace. I'm using `recc` which is like `distcc` or `ccache` in that it wraps the compiler job. Every developer keeps their workstation. It just routes the actual `clang` or `gcc` calls to a Kubernetes cluster which provides distributed build and cache.
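The core idea behind wrappers like `recc`/`ccache` can be sketched in a few lines: key the cache on a digest of the compiler, flags, and source, so identical jobs (e.g. after a `git pull`) are served from cache instead of recompiled. This is a simplified illustration, not recc's actual implementation; all names here are invented:

```python
# Illustrative sketch of content-hash compile caching (not recc's
# real internals): identical (compiler, flags, source) triples map
# to the same key, so repeat jobs skip the actual compile.
import hashlib

def cache_key(compiler: str, flags: list[str], source: str) -> str:
    h = hashlib.sha256()
    for part in [compiler, *flags, source]:
        h.update(part.encode())
        h.update(b"\0")  # separator so ["ab"] != ["a", "b"]
    return h.hexdigest()

cache: dict[str, bytes] = {}  # in reality: a shared remote cache

def compile_cached(compiler, flags, source, do_compile):
    key = cache_key(compiler, flags, source)
    if key not in cache:
        cache[key] = do_compile(source)  # cache miss: run the real job
    return cache[key]
```

In the real tools the cache lives in a shared remote service (here, the Kubernetes cluster), which is what lets one developer's full rebuild warm the cache for the other 299.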
> Isn't there a less convoluted way of making the best engineers leave?
We have 7000+ compiler jobs in a clean build because it is a big codebase. People are waiting hours for CI.
I'm sure that drives attrition and bringing that down to minutes will help retain talent.
> Tens of thousands of vCPUs for a single compilation run, or to accommodate 100 developers who try to compile their own changes?
Because it uses remote execution, it will ideally do both. My belief is that an individual developer launching 6000 compiler jobs because they changed a header will smooth out over 300 developers that generally do incremental builds. Likewise, this'll eliminate redundant recompilation when git pulling since this also serves as a cache.
It happens when someone modifies a widely included header file. Which there are a lot of thanks to our use of templates. And this is just our small team of 300 people.
> Have you thought about splitting that giant thing in smaller chunks?
Yes. We've tried but it's not scaling. Unfortunately, we've banned tactics like pImpl and dynamic linking that would split up the codebase, unless they're profiled not to be on a hot path. Speed is important because I'm writing tests for a semiconductor fab, and test time there is more expensive than at any other kind of factory on Earth.
I tried stuff like precompiled headers but the fact only one can be used per compilation job meant it didn't scale to our codebase.
It sounds like you have a job, right out of college, but you're griping about not getting promoted faster. People generally don't get promoted 9 months into a job.
I'm reading your post and I am genuinely impressed by what you claim to have done. At the same time I am confused about what you would like to achieve within the first year of your professional career. You seem to be doing quite well, even in this challenging environment.
How many juniors are OpenAI or GDM going to hire in a year? Probably double digits at most, so the chances are super slim, and they are by nature allowed to be as picky as they should be.
That being said, I do agree this industry is turning into finance/law, but that won't last long either. I genuinely can't foresee what happens if/when AGI/ASI is really here; it should start generating ideas to better itself, and there will be no incentive to hire any human for a large sum anymore, except maybe a handful of individuals on Earth.
Because AI accelerates the rate of knowledge gain, this gets even faster.
Someone who jumps higher than expected when the boss demands it?
Someone who works 996 in the office?
Or someone who knows what they’re doing?
I think this is bigger than any individual. It’s just a matter of time before you’re let go. There’s no loyalty from companies at all. Not when they’re seeing higher than expected profits and are still cutting huge percentages of staff every year. There’s no strategy or preference to it. I don’t think this has to do with how you or I perform on the job.
Most people I’ve talked to lately who are still employed are watching out for their job to get cut.
Before that role, I spent two years at another government contractor working on various govt. applications doing UX research, design, and front-end UI development. Overall, I’ve had a 17-year career in UX Research, Design and Development, starting at an ad agency in 2009.
From 2016 to 2022, I worked hard on government projects and enjoyed collaborating with great, close-knit coworkers and receiving consistently positive client feedback. From 2022 to 2026, things changed as the company grew—my role narrowed to UX research and design while newer hires handled UI development. I often felt underutilized and raised it, but management assured me I was doing well. With little direction from my last manager, I focused on staying visible to the client by monitoring user chats, identifying UX issues, and proposing design solutions that the client appreciated and the development team implemented.
Looking at where the tech industry is now—with thousands laid off from government IT and the broader tech sector flooding the job market, creating rising competition, constant pressure to work harder (Elon wants us to work as hard as Chinese workers do) and AI rapidly reshaping creative and development roles—I’m not very interested in that level of stress. I worked hard for many years and enjoyed it, but I value MY LIFE and MY HEALTH more than participating in the current “battle royale” environment in tech.
Overall, now with AI I feel graphic & web design, as well as front-end web development, is a stupid career! It was a nice run: bought two houses from it, worked remotely, and when things were slow worked from wherever in the lower 48, and now... in April I'm starting nursing school, and I'm not young (20 years of work left in me). Roll with the punches here, yet the punches are gonna punch hundreds of thousands to millions in the face... not sure how this is any good for an economy and society, but here we are! If you are like me, sell your house and stash the money away to buy houses when the crash from AI happens!
I suspect that nursing is an excellent choice.
I got pushed out, and slapped with the Dead Fish of SV Ageism. It was brutal, and I got pissed off.
But in the long run, it's been the best thing that ever happened to me. I would have liked the extra ten years of salary and saving, but I'm not entirely sure that I would have survived it.
> The retreat challenged the narrative that AI eliminates the need for junior developers. Juniors are more profitable than they have ever been. AI tools get them past the awkward initial net-negative phase faster. They serve as a call option on future productivity. And they are better at AI tools than senior engineers, having never developed the habits and assumptions that slow adoption.
> The real concern is mid-level engineers who came up during the decade-long hiring boom and may not have developed the fundamentals needed to thrive in the new environment. This population represents the bulk of the industry by volume, and retraining them is genuinely difficult. The retreat discussed whether apprenticeship models, rotation programs and lifelong learning structures could address this gap, but acknowledged that no organization has solved it yet.
I would not mind switching but 1. I don’t see interesting positions 2. they don’t pay well, and only 3. they might not even want me.
It might also be just my niche, but finding a good position feels completely impossible for me.
I am doing cross platform mobile development and I’m wondering how I could transition into backend development or I started even considering the decentralized finance…
3.5 years ago was peak ZIRP hiring craziness.
It wasn't a normal reference point.
My resume isn't bad on paper either. It's not FAANG coded, but it's decent experience.
They're just as capable of typing prompts into AI, but what they don't have is good judgement of what good work/code looks like, so what's the point of asking a junior engineer to do something vs asking the LLM directly?
Nobody is gonna lose money because some script that generates yaml for the build process every hour nested three loops instead of two. Intern, AI, junior dev, junior dev telling an intern how to use AI, doesn't matter. If it works for the week it'll work for the decade. If someone needs to pick it apart and fix something in a year it'll either take no time because they know enough to do it easily or it'll be a good low stakes learning exercise for a junior.
Everyone wants to think their stuff is important but 99.9% of code is low stakes support code either in applications or in infrastructure around them.
This is the K-shaped economy playing out. It's a signal that the American middle class is hollowing out. Bad, very bad.
Wouldn't the assumption be the opposite, in that AI is magnifying the decision making of the engineer and so you get more payback by having the senior drive the AI?
I suspect a lot of it best practices will be enforcing best practices via agents.md/claude.md to create a more constrained environment.
Juniors seem to split into the category of trust everything the ai says, or review every step of the implementation. It’s extremely hard to guide the ai while you are still learning the basics, opus4.6 is a very powerful model.
Quite often the AI guesses accurately and you save the time you'd have spent crafting the perfect prompt. Recently, my PM shared a nigh-on incomprehensible hand-scribbled diagram on Slack (which, in fairness, was more or less a joke). I uploaded it to Gemini with the prompt "WTF does this diagram mean?". Even without a shred of context, it figured out that it was some kind of product feature matrix and produced a perfect three paragraph summary.
I've never really seen the value in the planning phase as you're free to just throw away whatever the AI produces and try again with a different prompt. That said, I don't pay for my tokens at work. Is planning perhaps useful as a way of reducing total token usage?
This could simply be a matter of style however.
Being able to clearly describe a problem and work with the AI to design a solution, prioritise what to put the AI to work on, set up good harnesses so the quality of the output is kept high, figure out what parallelises well and what’s going to set off agents that are stepping on each others toes… all of this needs experience and judgement and delegation and project organisation skills.
AI is supercharging tech leads. Beginners might be able to skill up faster, but they’re not getting the same results.
For average to low-performing intermediates/seniors... there's not much difference in output between them and a good junior at this point. Claude really raised the skill floor for software development.
I find it easier to get a reasonably smart senior to use AI in a good way, than to train a junior in what thinking to do, and what to outsource, learning basics about good design, robustness and risk analysis. The tools aren't the problem per se, it's more about how people use them. Bit of a slippery slope.
That's just my anecdotal experience from not a whole lot of data though. I think the industry will figure it out once things calm down a bit. Right now, I usually make the bet to get one senior rather than two juniors. Quite different to my strategy from a few years ago.
While I could buy that hiring managers believe this, it's not actually true.
The gulf between the quality of what a sr developer can do with these tools and what a jr can do is huge. They simply don't know what they don't know and effective prompting requires effective spec writing.
A rando jr vibe coder can churn out code like there's no tomorrow, but that doesn't mean it's actually right.
This is also where the microservices pattern fits in well, because each individual unit is so small that no design is needed.
I’ll bite: why? Genuine question, not a weird gotcha.
Seniors have much more advantage right now in using AI than Juniors. Seniors get to lean in on their experience in checking AI results. Juniors rely on the AI's experience instead, which isn't as useful.
Truth is, when I was part of larger orgs/enterprise I definitely saw some folks who were dead weight, and, not to be harsh, a few of these knew they weren't contributing and were being malicious in that sense.
Similarly, I wonder how many high performers now are taking multiple jobs thanks to remote work and exposing the mid to low performers. Like some kind of developer hypergamy taking place.
I've been looking again this year and the landscape has changed drastically. Specialization is the name of the game, I have a good amount of experience working with Growth initiatives and I've been getting good responses from roles that are looking for either Growth or Design engineers, roles that were not as prevalent years ago.
That sounds good for many of us (and don’t we all like to think we’re top candidates here on HN…) but is there any data to back this up? Or it just anecdata (not to dismiss anecdata, still useful info).
That is pretty context sensitive. You're correct that there's no real deep AI use expertise broadly understood to exist at this point (unless you're Steve Yegge?), but if people think they can toss out the engineers with experience in the systems that have been around a while, with junior developers "guiding" changes — that's likely a good way for a business to fall on its sword.
People with experience and/or credentials desired by companies in areas of growth (i.e. AI) are always in high demand
This is tautological.
Apparently over a third are affected in my domain. Which is crazy. Pretty much everyone in my immediate band has been hit at some point. Those that weren't were usually 5-8 levels above me. So basically a different generational band altogether.
As was foretold in Tyler Cowen's 2013 book "Average Is Over".
In it he argued that the modern economy will undergo a permanent shift where "average" performance no longer guarantees a stable, middle-class life.
He predicted that the economy will split into two distinct classes: a high-earning elite (roughly 10–15% of the population) who thrive by collaborating with technology, and a larger group (85–90%) facing stagnant wages and fewer opportunities.
AI summary of the other key points of that book:
The "Man + Machine" Advantage: Success will belong to those who can effectively use smart machines. Cowen uses Freestyle Chess (teams of humans and computers) as an analogy, noting that human intuition combined with machine processing power consistently outperforms either working alone.
The Power of Conscientiousness: In a world of abundant information, the scarcest and most valuable traits will be self-motivation, discipline, and the ability to focus.
Hyper-Meritocracy: Advanced data and machine intelligence make it easier for employers to measure an individual's exact economic value. This leads to extreme salary inequality as top performers are identified and rewarded more precisely.
A New Social Contract: Cowen predicts a future where individuals must be more self-reliant. He suggests society will move toward lower-cost living models for the non-elite, featuring cheaper housing and "bread and circuses" in the form of low-cost digital entertainment and online education.
EDIT: Notice how we're basically already here: Netflix is cheap, YT is free, Khan Academy and MIT OCW is free, Coursera/Udemy/etc. are cheap.
Stagnant vs. Dynamic Sectors: The economic divide is worsened by "low accountability" sectors like education and healthcare, where productivity is hard to measure and costs continue to rise, unlike tech-driven sectors that see rapid gains.
If intermediates were being pushed out they would just take junior roles to have something
Companies really don’t like hiring Juniors in general
https://www.folklore.org/Negative_2000_Lines_Of_Code.html
What we really need is the -10X engineer ;)
Alas, his job would entirely consist of debloating the slop everyone else is pumping out "at inference speed".
https://steipete.me/posts/2025/shipping-at-inference-speed
You can be a great unblocker and team lead, work well within cross-cutting areas and with interdepartmental stakeholders, and have a history of strong technical performance.
And yet it's nebulous whether that means you're a high performer to those hiring. It seems "culture fit" is a common reason people aren't getting hired again. That was out of vogue for a good while.
I've noticed a huge tightening of the rope around that sort of thing.
I can't tell you how many times I've passed all the tests, all the interview things, get to the final round with the team and the rejection email comes in despite having good conversations. By all accounts, I believe any person would say the interview went well.
My peers are reporting the same things.
I haven't found that to be true. Unless by "top candidates" you mean people working at actual AI companies such as Alphabet/Meta/OpenAI/Anthropic. If you're an AI-user and not an AI scientist it's bad out there, even for senior+ developers who previously worked in "FAANG".
HN user: not in my experience!
It's pretty depressing. I'd take just about anything at the moment. I understand desperation going into a job interview isn't ideal either.
It feels like I'm in a hole.
Happy to be on the high-end ^^.
Tell me about all the junior developers you've hired (it's none)
This is probably the dumbest take I've heard of. They're the most likely to make mistakes with AI because they don't know the pitfalls of what they're doing.
Also this only captures 6 industries, which is a narrow view of what would define "tech" these days.
Not to say that the job market isn't tough but this graph is a very narrow view
Can’t believe how many people are commenting without looking at what the chart means. We’ve lost 50k jobs in the last two years after decades of adding 100k+ every year, including the pandemic highs of 300k+ per year. Total employment remains way above the 2000s, 2008, and 2020, unlike the title suggests.
I'm not even sure this chart tells the story of the title.
The health of the market is not a function of the total number of jobs alone, it's a function of the number of jobs and the number of people to fill them.
The number of total jobs going up year after year meant that there were increasing numbers of candidates, new people entering the field. If job growth stops, there will still be candidates coming in. There will also be the new hires from the last decade moving into increasingly senior roles, and there won't be space for them (unless you devalue the meaning of "senior" even more).
So the year over year change matters a lot. If it plateaus, or even declines slightly, it's more than enough to make a terrible market.
Tech employees: 5.5m vs 9.9m.
Software developers: 0.68m vs 3.2m.
Different ball game.
EDIT: posted below as well https://xcancel.com/JosephPolitano/status/202991636466461124...
There’s a longer term graph in the thread. We’ve got a long way to go before we hit 2000 numbers which is what I’d expected.
For context, I had my 2600 square foot 3.5/2 bedroom house built that year for $175K.
Healthcare
Education (not just for learning, but for signaling).
Everything else is inconsequential in my budget.
I was working at a company that printed bills for utility companies and had offers from banks, insurance companies etc. The world didn’t stop buying Coca Cola, flying Delta or stop buying stuff from Home Depot because of the dot com crash
I don't think the market is flooded with new devs as many state, I think we are in a deep silent crisis
"He's been unemployed for 13 months? Why doesn't anyone want to hire him? Must be something wrong with him"
Does that make sense?
You guys keep trying to put a pillow over our face.
I blame this on people spamming fake AI CVs 24/7, no one is going to review hundreds of CVs.
Maybe people just have their fingers in their ears but this has been a problem for years now
My friends who are still at Google also say that most job postings will end up going to someone internally - in fact people say they don't do that many external interviews anymore.
Finally the interview cycle seems to take a lot longer than I remember with quite a few added rounds.
The primary thing going on in the market right now is a lot of companies simply over-hired during the post Covid boom and they’re correcting for that.
I'm a c++ dev, with excellent senior tests, but low experience, and no degree in France. 3 years without a job.
I yearn for a new pandemic.
Fortunately, I learned how to live without a job, found other things to do and how to live a life. Welfare is generous, and I have good savings.
Honestly I don't really want to work in software anymore. If there is a job offer and recruiters are calling me, I answer and I accept.
But I'm not applying to all positions I can see and I won't run after them.
YMMV but that’s coming from a guy who writes in at least 3 languages at current $dayjob.
Oh to be French
TL;DR - at least in my little bubble, the C++ systems engineer market has been consistently hiring people, though good engineers are hard to find.
At the same time, the economy at large didn't seem to change very much.
Why did this happen?
What is happening now is the unwinding of the above. Now it's:
Higher rates + AI + too many SWEs (bootcamps and over-hiring) = Busting economy
I think what we are in right now is more the norm and the post covid boom was an exception.
[1]: found a ref https://www.bnncpa.com/resources/one-big-beautiful-bill-act-...
https://www.citadelsecurities.com/news-and-insights/2026-glo...
I got 4 or 5 standard rejections.
I have non-English name so that definitely hurts. I have AP EAD which is a stage between H1B and Green Card and I still require sponsorship. It's complicated but I can't just switch to EAD right away.
It's not just engineers. It's managers and experienced people as well. Don't believe the top comment that it is bimodal. Unless you are a superstar (99.99%) it's becoming hard to get noticed. I thought of going back to an IC role but it is hard to pick up and do leetcode all over again. It is extremely hard with special needs kids at home.
Any suggestions or recommendations for me?
I had a managerial* position I ghosted because they had Leetcode literally written on the agenda.
* - managerial is replaced with Lead. Lead is expected to be hands-on as well as have serious managerial experience. Since it's easier to lie about managerial experience, you have people lying into these roles and becoming terrible managers.
Also, that spike in 21/22 really did a number on people's expectations. The one constant in this industry is its cyclical nature.
If it continues, then yes it could be bad, but so far it seems like a correction for over-hiring in 2021 - 2023. Seems a little weird to be focusing on a decline in 2024 - 2026, without addressing the large increase right in the years before.
> Tech employees: 5.5m vs 9.9m.
> Software developers: 0.68m vs 3.2m.
> Different ball game.
I had no idea I was in such an exclusive group back in 2000. Everyone I knew was a software engineer or in tech one way or another so I suppose I got a warped sense that I belonged to a larger group.
In the 90s tons of people who were de facto software engineers were listed as "Information Technology Workers". I suspect a lot of that still hasn't been shaken out of the system.
According to the BLS in the year 2000 there were 3.4 million information technology workers.
Today there are computer programmers (15-1251), and software developers (15-1252), and web developers (15-1254).
In 2018, there was a reclassification - https://www.dol.gov/sites/dolgov/files/ETA/oflc/Presentation... where 15-1132, Software Developers, Applications and 15-1133, Software Developers, Systems Software were reclassified into the software developers (15-1252) group.
The other thing that confuses this is that a lot of positions were classified as Computer systems analysts because that's a position that a TN visa can be hired for (there is no software engineer in there... and it wasn't until relatively recently that one could be a "software engineer" in Canada without being an Engineer).
Back in 2010 ... https://www.bls.gov/cps/cenocc2010.htm
Where the "Computer programmer" was the more junior classification and Software developers working on a word processor were classified differently than a software developer working on the operating system... and they were the more senior positions.This division still shows up in the definitions.
https://www.onetonline.org/link/summary/15-1252.00
https://www.onetonline.org/link/summary/15-1251.00
Wow. Just wow.
Most people would be thankful to have a secure well-paying job in the post-AI blow-off; increasingly it's going to be harder to differentiate yourself against anyone else using AI. That we have people still in the thick of AI who don't understand that is a strong signal that the AI boom is still going to come take some jobs.
If you're in a software related role and AI isn't making you more productive, it's on YOU as a dev to figure things out quickly.
AI is coming for your job so you can either be an AI manager, or you can get managed out for AI.
caveat: This is my take as someone who used to do a lot of hand coding, and now regularly has a small team of AI doing anything that would have normally required mostly brute coding strength but not too much thought; that's faceted plots, refactoring libraries, improving pipeline efficiency, adding parallelization where possible, building presentations, adding test coverage.
I still kinda want to see this going back to 2000. That must be the biggest tech crash by far. 2008 and 2020 were overall market crashes, but tech was booming.
> Moral hazard is when one party takes actions that impose costs on others because they don’t fully bear those costs themselves. With ghost jobs, employers get benefits (brand signaling, resume mining, internal optics) while job seekers eat the time, emotional, and sometimes financial cost of chasing something that never really existed.
- really wants to hire H1B, but needs to pretend to interview first for compliance. These usually have absurd requirements to make it viable to reject anyone.
- really wants to do an internal or referral hire or promotion, but needs to interview for HR compliance. These usually have such specific requirements that only the person they want qualifies.
- posts jobs because a company wants to look like its growing, even when it's not.
- posts jobs to either signal to an employee that they are replaceable, or to try and relieve a stressed employee that more help is coming. Either way, it's a bluff
- yes, sometimes you want to hold out for the perfect unicorn and are not in any way in a rush to find them. There's no distinction for this, but job posts are cheap so why not?
- outdated posts that still stay up because there's no rush to take them down.
- a technique used to lower compensation. They post a job, see how many applications it gets. If it's more than enough, they take it down (with no interviews) then put it up once more at a lower rate. Repeat until not enough people apply. This may or may not lead to interviews because the actual goal is market probing.
- purely to advertise the company instead of actually hire. Usually done at career fairs where you talk and realize there's no actual open positions.
Can also happen when it takes 3 months to get a job posting approved, so once you get one you just leave it up.
The comp technique you mentioned though seems like a lot of work for price discovery, surely there are data sets out there?
There's an IT careers site that was sold and, I believe, went through a re-branding. Now they also offer AI and "personal" resume reviews _and_ writing, cover letters, and they even have members do a 10-15 minute AI virtual interview that ostensibly could be shown to a hiring manager.
I was unemployed as a PM for about three months. I applied to on the order of 100 roles at this site, as well as applications on the other sites you'd expect, from LI to more niche.
I felt that this site was "underperforming". I never heard back from jobs I'd applied to that I'd only really seen there, even though I also saw jobs on it that were advertised in other places.
What sealed it for me was that towards the end of the three months, I got an email from the site. "Your profile has been viewed". I open it, "An employer is looking at your profile". I'd never seen this type of email from them before, and sure enough: "Your profile has been viewed 1 time in the last 90 days". That was it. No contacts, and only one employer has even looked at my profile on the site (and this is the kind of site where that'd be the only place they could look at your application). And that employer didn't even have positions open.
But the site does ask you questions to "submit to the employer" about "why you want to work here" "why you'd make a good fit", etc.
And I'm entirely convinced that only a (very small) fraction of the jobs they're advertising are "real" and ever reviewed by anyone at all (maybe the "promoted" jobs?), and that they're harvesting positions and jobs from other sites or employers (the positions seem to exist somewhere, or at least ads for them do)...
... and that their chief motivation for this is getting all your answers to train their models for their actual revenue generator - AI resume writing, cover letter writing, etc. All pre-seeded with other people's real answers to such questions.
I've applied for many jobs where I was perfectly qualified and got rejection notices immediately. I applied on a Sunday and got rejected on Sunday an hour later. No human reviewed that application I made, it was auto rejected, and if that's the case, what other explanation is there than "ghost jobs."
You didn't pass some arbitrary ruleset given to an AI or machine learning algorithm.
Companies can be very selective now, and usually implement this selectivity fairly stupidly. There also is the problem of being genuinely swamped with bullshit applicants for positions, so the false positive rate is likely quite high at the moment.
I've found it extremely difficult to sort the wheat from the chaff right now. Finding competent people is more difficult than ever, but the sheer number of applicants is at least an order of magnitude higher. Botting has made applying to jobs exceedingly low friction, so there is very little downside for someone entirely unqualified to apply to 600 jobs a day and hope they get lucky.
We have positions that have been open for months that go unfilled simply due to lack of time to sort through applicants, and the few we do have time to interview usually are obviously unqualified within the first 5 minutes of talking to them.
I've just had lots of rejections, including some where I did have a good fit, so I don't think "AI auto rejection" is the only story. I have good credentials, several F500 experiences, no big career gaps.
The only real success I have had in the last few years is targeted emails (from who is hiring on HN) or through my network.
It's very different than at any other time and I believe it is a combination of a terrible market, AI rejections, and ghost jobs. And I'm sure there are more than a few ghost jobs.
It also might point to a filtering mismatch where you get a high false positive rate.
Most of the good folks have come in via word of mouth and networks, as they typically do.
For those outstanding positions they are "very nice to haves" but obviously not critical. When the right candidate gets matched we'll jump on the opportunity, but it's not an existential problem for the moment.
This scenario isn't a "fake job," which are more akin to ghost/scam/non-existent openings.
also getting into plumbing, curious to see what others are doing in this regard.
Jobs are now significantly more demanding too, do more and make less.
For what it's worth, I ended up getting a tech job in Japan instead. Ironically, the requirements at U.S. startups are much higher, and U.S. startups fit the stereotype of Japanese work culture more than Japanese companies nowadays.
https://muneebdev.com/software-development-job-market-india-...
"Other professional, scientific, and technical services" grew month over month and year over year
"Information" took a hit, but the bulk of that was "Motion picture and sound recording industries"
"Computing infrastructure providers, data processing, web hosting, and related services" modestly shrunk, but "Web search portals, libraries, archives, and other information service" is the only area to grow under information.
This seems different from what the post says. They also said worse than 2008, but didn't post any information. I would imagine the total market was much smaller then, so while the total number of jobs lost was probably smaller, the percentage was probably larger. When I started in 2012, tech would take anyone with a science degree.
I don't understand the job titles being proposed in the post; are they using different BLS data than me?
https://www.bls.gov/news.release/pdf/empsit.pdf
It's a weird employment category that includes both Google SWEs and your local librarian.
Be that ideal. The shareholders are counting on you!
Exactly my point. Encourage LLM adoption, faster, faster. Be excited about your homeless future, software engineers!
[cue the POS https://youtu.be/SP-gN1zoI28]
So have all the great engineers I've been working with - there's a deep desire for growth past the things that you're currently good at.
The people worrying they might code themselves out of a job are in a different skill demographic. (Ironically, that means they won't be able to code themselves out of a job)
you’d think, but in my experience, once you reach high salary - in a lot of places - you can coast for a long time with very little output
What exactly do you mean by that? Do you mean you finished one project but your employer had another one for you, which you then were expected to work on instead of sitting idle? Or do you mean you coded yourself into a "promotion"?
My comment was just mocking the foolish selfless ethos of many software engineers, who don't look out for themselves and idealize giving to psychopathic organizations that will screw them the moment that's advantageous. Many software engineers have a pathological level of naivete and confusion about the role they really inhabit (e.g. righteously going on about buggy-whip makers).
The next step is for me to respond with an LLM. Maybe if my LLM is good enough it’ll convince their LLM to skip the interview and just offer me a job.
This chart shows that the rate of year-over-year, month-by-month change is worse than 2020.
But the number of tech jobs has grown by 12% since April of 2020 (2.34M vs. 2.63M). Heck, there are more tech jobs today than at the beginning of 2022 (2.61M), even.
Job market sucks, trend is bad, but post title is a misnomer for what this chart shows.
(Numbers based on a quick grab BLS.gov data of CES6054151101 (Custom Computer Programming Services) + CES5051800001 (Computing Infrastructure Providers, Data Processing & Web Hosting) + CES6054151201 (Computer Systems Design Services)---couldn't find other ones quickly and gave up :))
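The arithmetic behind that 12% figure is simple enough to check. A minimal sketch, reusing the totals quoted above (sums of the named CES series, in millions); nothing here is fetched from BLS:

```python
# Back-of-envelope growth check: total tech jobs, April 2020 vs. today,
# using the figures quoted in the comment above.
apr_2020_total = 2.34  # millions, sum of the three CES series
today_total = 2.63     # millions, same sum today

growth = (today_total - apr_2020_total) / apr_2020_total
print(f"{growth:.0%}")  # prints "12%"
```

Which supports the point: the *level* of employment is up even while the *rate of change* looks dire.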
One could speculate this is likely a shift in what gets funded and invested in.
I’ve been looking for work for nearly seven months. I can write low level systems code in C and C++ to web applications in Python and compilers in Haskell. I have tons of industry experience.
Yet most places I apply to ghost me or follow up a month later that the position has been filled.
Companies that have been laying off people claim they are seeing record profits.
It seems like we went from a relatively stable growth to just chaos.
In the mid 2010s, then most notably in late 2020 - 2021, you had people who had no interest in tech entering the industry because they saw it as an easy career to make decent money in.
It got pretty bad in the late 2010s, but it became almost comical in 2021, with people who took two-week coding bootcamps suddenly landing 6-figure jobs. Some of these people were even working at multiple companies at the same time.
The optimist in me hopes this all shakes out with those people who had no interest in tech moving on to other things. These types of people were not only bad employees, they were also bad for the industry, and in my opinion responsible for the culture shift from tech being a place dominated by "nerds" and "geeks" in the 90s/00s to the modern "tech-bro" stereotype.
The realist in me, though, will continue to warn people that the tech job they're working at today is likely their last. Between tech industry growth slowing, the excessive overproduction of tech talent, and AI + SaaS automating a lot of traditional software development work, it's going to be exponentially harder to remain employed in tech in the coming years.
So much so you might as well find a relatively worse paid job if it means you don't have periods of months of unemployment every year.
I remember it was even earlier than that, in the 90s when Bill Gates became the richest man in the world.
You're right about 6 month bootcamps leading to jobs in 2021 though! A true gold rush.
I grinded my 20s away trying to have a successful career and if that just gets pulled out from under me I’ve got absolutely nothing.
While I think a lot more was going on with him than being unemployed, I'm convinced AI hitting the scene had a bit to do with it. They were an older dev 50+.
Too bad I guess.
But as per usual, the bust hit just as hard as the boom. Multiple high-profile failures in games and initiatives as a whole, Microsoft and Apple decided to stop bleeding money with their respective subscription deals, mobile gaming (from the advent of Genshin Impact and co) became less an easy cash grab and more a 2nd wing of AAA development, and investments dried up overnight for indies (unless 'AI').
And the headcount, of course: https://variety.com/2026/gaming/news/one-third-video-game-wo...
COVID did weird things to the industry, that's for sure.
There was always a "clock" for junior engineers to prove they could handle the high pressure and high intensity work, and as long as they were meeting the bar, they were safe.
They called on-boarding "Bootcamp", and it was for every engineer, junior to staff, to learn the process and tooling. Engineers were supposed to be empowered to take on whatever task they wanted, without pre-existing team boundaries, if it meant they were able to prove their contributions genuinely improved the product in meaningful ways. So, come in, learn the culture, learn the tooling, meet others, and then at some point, pick your home team. Your home team was flexible, and you were able to spend weeks deciding, and even if you selected one, you could always change, no pressure. Happy engineers were seen as the secret sauce of the company's success.
I remember that summer, vividly. They told the folks in Bootcamp, pick your home team by the end of the week, or you will be stuck in Bootcamp purgatory. At the same time they removed head count from teams, ours went down to a single one. A new-grad, who had literally just arrived that Monday, picked our team on Tuesday, and then had to watch as most of their fellow Bootcamp mates got left behind.
People wondered what would happen to them for weeks, and then, just like that, the massive layoff sent them all home. It was shitty because from where I sat, it was basically a slot machine. Any one of the folks in Bootcamp was just as capable, but we had one seat, and someone just asked for it first.
That being said, I don't think it's unfair to point out that creating a massive influx of new developers without jobs that provided good mentorship (most jobs are awful at mentoring junior developers) is going to have huge consequences that we're now dealing with. I think the "learn to code" thing was a massive mistake. Encourage the people that want to, sure, but don't try to pull people in that are only marginally interested in a paycheck.
Not that I disagree with you here, but it is hard to square this with people who are also saying not to worry about AI displacement because there's limitless demand for software.
Well, that's easy to square: the idea that there is limitless demand for software is nonsense. Pure fiction
What is the Software Publishers category? It seems it's picking up, while Computer Systems Design has the largest negative impact.
I would appreciate it if there were a better chart explaining the sorts of roles and locations that had the largest impact.
Any commentary about tech jobs that does not include the interest rate environment and the massive over hiring that occurred between 2019 and 2022 is borderline dishonest.
Look at federal data of SWE job postings and look at the federal funds rate for the same period. Jobs is giant mountain peaking in ‘22. Interest rate is zero for the pandemic and then spikes right when SWE jobs start to collapse.
Tech hiring is all downstream of interest rates. AI has had almost no impact, at least not yet. (Block layoffs were not AI, look at their stock, they basically can only succeed as a financial company when money is free, very misleading and a convenient excuse for terrible management to now say they need to be “AI native”)
Should be something like tech employment rate of growth is lower than it was in 2008 or 2020.
There are many many more tech workers than at either of those points.
Bottom line, if programmers are fucked, so is just about anyone else.
Signal-to-noise ratio is so much higher in hardware; nobody performatively studies mechanical engineering to make $60k in Ohio.
The other thing is it's showing the first derivative, not absolute numbers, which is a very questionable way to derive "worst employment situation" in a field that has been on a world-changing boom over the last 50 years.
https://x.com/JosephPolitano/status/2029916369056079975
The question that I have for this data though is that its showing the derivative - the change each year in hiring.
The dot com crash is clear and very visible in there. The global financial crisis is also a dip in there (I'm saving this for when people claim the number of jobs lost compared to the dot com crash).
From 2010 to 2020, there was a fairly steady linear growth of employment. There was the dip in 2020, but 2020 to 2024 had a much higher peak. My "I want to know about the data" is "is the area above +150k jobs from 2020 to 2024 greater than the area below 0 from 2024 to 2026?"
I was able to find the following:
- Software Publishers https://fred.stlouisfed.org/series/SMU06000005051320001
- Computing Infrastructure Providers https://fred.stlouisfed.org/series/CES5051800001
- Computer Systems Design https://fred.stlouisfed.org/series/CES6054150001
- Web Search Portals https://fred.stlouisfed.org/series/CES5051900001
- Streaming Services https://fred.stlouisfed.org/series/SMU06000005051620001

I wasn't able to find the following:

- Custom Computer Programming Services

There are numerous open questions in this analysis which would need to be addressed before drawing any conclusions. My gut feeling would love to accept it at face value, but I never trust my gut.
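Since the chart shows a derivative, the "area" question above reduces to summing changes over each window. A minimal sketch of that computation; the monthly values below are invented placeholders, not FRED data, so only the method is meaningful:

```python
# Year-over-year employment changes, in thousands of jobs/yr.
# PLACEHOLDER numbers -- swap in the real FRED series before concluding anything.
yoy_changes_2020_2024 = [180, 210, 140, 230]
yoy_changes_2024_2026 = [-80, -130, -60]

# "Area above +150k": how far each boom-era reading exceeds +150k, summed.
boom_excess = sum(c - 150 for c in yoy_changes_2020_2024 if c > 150)
# "Area below 0": total magnitude of the negative readings.
bust_loss = -sum(c for c in yoy_changes_2024_2026 if c < 0)

print(boom_excess, bust_loss, boom_excess > bust_loss)
```

If the boom-era excess outweighs the bust-era loss, the field is still above its pre-boom trend even after the drawdown.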
I haven't heard from a recruiter in probably 6 months. I recently put my feelers out and applied to a handful of positions I was qualified for, and got rejection letters from all of them.
There are also a lot of people posting fake jobs for feeding LLM datasets, running scams, and bidding down labor costs. =3
I have no idea about what's coming, but I wouldn't pay a whole lot of attention to people who are looking at the plots of a highly volatile and cyclic industry that goes through constant boom-and-bust cycles, and are trying to position this as proof that AI is or isn't having an impact.
You're really playing loose with the Venn diagram here.
Only "some" Amazon SWEs would "code JavaScript (poorly)."
The most recent one was a few months ago, and I passed it with a great score (top 5% of candidates, etc.), but that wasn't enough to get me hired.
Terrible market. I'm at my wits' end as to how to even approach this.
Tech was and still is the easiest way to make 200k base salary, before even thinking about the stock.
We need a reset and anyone who can’t make it can go fill the jobs we need in construction, education, etc.
What mediums are you using for recruiters to contact you? Do you have a LinkedIn, or are you applying directly to recruiting companies? Are you active anywhere else?
Genuinely interested in how you're receiving so many recruitment emails. That used to be my go to way to hit the job market.
The "inshoring" hubs like RTP and Denver or those hubs that are dominated by a handful of oversized companies like Seattle are the worst impacted.
* Since I hit the pavement in late January, I've tracked 100 job applications
* Of those 100, only 7 have turned into interviews
* Of those seven interviews, 3 turned into second-round
* ~50% of all applications never receive a response
* ~20% of rejections for any reason have the role re-posted within thirty days
* For rejections stating "higher quality applications", that role re-post rate is closer to 50%, suggesting either ATS systems culling too many candidates to fill the role, or ghost jobs
* Despite my state requiring salary ranges be posted in the JD, only around 70% of postings included what could be considered "reasonable" estimates
* 100% of interviews have been for local employers requiring 3+ days on-site
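The tracked funnel above can be restated as conversion rates; every count here comes straight from the numbers in the list:

```python
# Application funnel from the tracking data above.
applications = 100
interviews = 7
second_rounds = 3

interview_rate = interviews / applications      # 7% of applications
second_round_rate = second_rounds / interviews  # ~43% of interviews
print(f"{interview_rate:.0%} of apps -> interview, "
      f"{second_round_rate:.0%} of interviews -> round two")
```

Notably, the big drop-off is at the top of the funnel; once an interview happens, nearly half advance.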
And now, some observations not captured in the data directly:
* Employers are trying to "under-title" folks; Senior roles want to hire former Leads, and Management roles want next-rung candidates for prior-rung titles (e.g., hiring what should be a Senior Manager for an entry-level management role)
* Employers are also trying to underpay workers by a large margin, especially folks coming from Big Tech ("We don't pay {SV_FIRM} money" while offering salaries below the local 50%ile for the role in question); they're blaming a "surplus of tech talent", which may or may not be true (I lack the data to prove either way)
* The two above points are in conflict, because rent/mortgages in these areas are so steep that even with major lifestyle changes to cut costs, these wages simply aren't survivable for local areas
* "Credential Creep" is back in force: Architect certs required for mid-level engineering roles, buzzwords prioritized over outcomes and achievements, and AI ATS' rejecting qualified candidates flat-out
* College Degrees are relevant again as a means of pruning candidates; fifteen years of experience is irrelevant for a lot of Senior roles if you don't have a BS or Masters, which wasn't the case even last year
* Industry-specialization is also back, even for roles where industry specialization is generally moot or easily picked up (e.g., Corporate IT stuff)
* A significant number (~75-85%) of roles explicitly reject H1B and other visa workers; not a problem for me (Citizen), but this is the worst possible time to be job hunting on a non-LPR status.
And now, my personal experiences:
* There's a very strong attitude of "you're being entitled" when it comes down to salary negotiations, even when you show your math for essentials - and share prior compensation history reflecting the cuts you've already taken since your Big Tech salary to "rejoin the market".
* Employers generally have no clue how expensive it is to live right now, especially in major metros; one such employer who balked at my comp floor genuinely had no clue the median rent was three and a half grand per month.
* Compensation seems particularly tilted towards working couples; as in, neither alone makes enough to survive, and employers assume you have a FTE spouse to shore up finances so they can pay you less
* Employers also don't seem to know what they actually want or need. Specialist Engineer roles (e.g., Cloud Engineer, Network Engineer) cite required experience and expertise with the full technology stack inclusive of ERP and HRIS nowadays, which is something that used to be handled by a specific team for the entirety of my career thus far, even in smaller (<1k) orgs. I've also seen Architect roles demanding Help Desk work, and Software Dev roles who want experience supporting Entra.
* AI does not feature in as many interviews as I would've thought. The few times it does, it's very much a "that's nice, but we're taking a wait and see approach" attitude
* There's a lot of eagerness to hire domestically again (I think even middle managers were tired of outsourcing or offshoring), but a lack of budget to afford domestic talent.
Ultimately, it's pretty bleak - but still better than last year, at least thus far (~300 apps, ~2 companies interviewed with, 1 offer in 2025). AI isn't the value-add I was sold on by career counselors and LinkedIn (huge surprise there /s), and there definitely seems to be the appetite to hire, but not the realism of what to expect or how much it'll cost. I very much view it as a sort of tug-of-war at the moment, between workers who did everything expected of them and have cut to the bone already, and employers who somehow think they can pay <50%ile wages while mandating 4-days on-site in a major metro for experienced talent.
If you're an employer looking to hire, I have some advice:
* Ditch the AI ATS or AI summaries and read resumes, especially if you're requiring local presence.
* Understand what you need (and what that will cost you) before posting the JD
* Understand the local cost of living, and budget accordingly (i.e., if your Senior Engineer can't afford median rent, they're not going to stick around when things improve)
* If you value loyalty and aren't paying TC to afford a median home in the area, then you don't actually value loyalty
* Don't pigeonhole yourself with hyper-specific candidates as a means of winnowing down applicants; that level of specialization will flee the second they get a better offer elsewhere
* Post salaries in the JD, required or not, so you don't waste your time with candidates whose expectations don't align with your budget
I don't want to sound like it's all a horror show though, I've had some interviews that have gone well with companies being sensible, so I think there's good stuff out there. But it's overall a rough market.
I've also run into the industry specialization roadblock a few times. Got turned down by a fintech company after multiple interview rounds because I did not have banking industry experience, for example. I guess I get it as a tie breaker but I've operated in a PCI compliant environment for years, seems like that should count as relevant experience? Also if you're going to dumpster candidates without banking experience why on earth did you waste several hours of your staff's time giving me tech screens?
Job hunting has always sucked. But it feels particularly busted at the moment. The process is miserable. If you've coasted to an easy hiring in the last year, you're either amazing (and hats off to you!) or got very lucky.
My example of that was when I applied for an Architect role (as I'm at that point in my upward career trajectory), and they asked me instead to apply for a Senior Admin role as they "didn't know what the Architect role would look like yet". I did, I included my comp target, and got the hard sell on why I was being unreasonable and should take {2016_PAY}/{$100k below SV_FIRM} instead. I mentioned my absolute floor was {$75k lower than SV_FIRM}/{$25k lower than my target}, ran him through my math (median rent for the area, on-site expectations, commute costs, food costs, insurance costs, 50/30/20 budgeting, etc), and pointed out that floor would only cover needs (50) and savings (20) with no fun money (30) whatsoever. Ultimately I withdrew my name entirely because the guy just wouldn't listen to me, and all but demanded I be grateful for his number in the current economy.
I suspect something similar is going on with another company that's seemingly ghosted me, after I stated I was targeting their upper boundary of their listed comp range - still $85k below {SV_FIRM}, but with growth potential towards Architect and Director-type IT roles. Even when I'm fine eating huge pay cuts for work (and falling off the homebuying ladder, as not even {SV_FIRM} paid house-purchasing money), the employers out there really do want perfect diamonds for the cost of Halloween Trinkets.
> Also if you're going to dumpster candidates without banking experience why on earth did you waste several hours of your staff's time giving me tech screens?
This is also something that's grinding my gears. Had an investment firm put me through six technical interviews with glowing recommendations every step of the way only for the seventh round (CIO) to put the kibosh on it without a reason and after showing up unprepared and disinterested. Also had companies say I lack financial discipline experience when I've literally built models, showback systems, budget forecasts, and cemented six-figures of monthly savings in prior roles; same with companies saying I "lack compliance experience" despite calling out running infra in highly regulated environments, performing compliance audits for clients, and uplifting infra to satisfy compliance regimes.
If I didn't know better, I'd say the entire HR process is just feeding shit into chatbots and letting them make hiring decisions. Nobody seems to actually care about the humans involved or the wider systems at play.
It's immensely frustrating, but I can only keep on keeping on until something changes. I don't need to win every application, I just need to win one.
We could arrive at the technical singularity and come up with 8000 IQ robots that can do things in a clean room but in the messy physical reality? I believe they will fail to catch up forever.
They will fail to deal with a stripped bolt head deep inside an engine bay that's been exposed to 40 years of road salt, that needs to be hit right with a 10lb hammer and a home made chisel until shit knocks loose, combined with cutting, welding, drilling, torching, tapping, impromptu redneck engineering, cursing, the use of 8 different kinds of penetrating lubricants, the acquisition of weird and highly model-year specific parts in a junkyard 500 miles away, realizing it's all wrong and doing it again.
Multiply the complexity by 100 times and that's what it's like to take on a classic car project.
Short term I'm freelancing and doing whatever else I can find to get by. Hoping for one more full time role before I start my self published ventures.
---
As I also mentioned, the only way you can survive in American tech at this point is to:
1. Move to a Tier 1 tech hub like the Bay and NYC. If you get laid off, you will probably find another job in a couple of weeks due to the density of employers. Seattle used to be a good option, but WA's norms around noncompete clauses incentivize larger employers which reduces the ability for startups to truly scale.
2. Start coming into the office 2-3 days a week. It's harder to lay off someone you have had beers or coffee with. Worst case, they can refer you to their friends' companies if you get laid off.
3. Upskill technically. Learn the fundamentals of AI/ML and MLOPs. Agents are basically a semi-nondeterministic SaaS. Understanding how AI/ML works and understanding their benefits and pitfalls make you a much more valuable hire.
4. Upskill professionally. We're not hiring code monkeys for $200K-400K TC. We want Engineers who can communicate business problems into technical requirements. This means also understanding the industry your company is in, how to manage up to leadership, and what are the revenue drivers and cost centers of your employer. Learn how to make a business case for technical issues. If you cannot communicate why refactoring your codebase from Python to Golang would positively impact topline metrics, no one will prioritize it.
5. Live lean, save for a rainy day, and keep your family and friends close. If you're not in a financial position to say "f##k you" you will get f##ked, and strong relationships help you build the support system you need for independence. The reality is the current set of layoffs and work stresses were the norm in the tech industry until 2015-22. We live in a competitive world and complaining on HN does nothing to help your material condition.
[0] - https://news.ycombinator.com/item?id=47174561
For people that can’t/dont want to move to the “hubs”, just know that there is absolutely still a career path. I will say though that you need to have above average communication skills and proactively build relationships during in person off-sites.
It also requires a level of maturity, clear thinking, self-starterness, and independence that is hard to come by without a proven track record and experience.
My advice is for the median/average SWE and HNer, not for the truly exceptional.
Spending 7-10 years in a hub and then going remote-first is the best path, because you build both the network you need to get referrals to vouch for you as a remote-first hire and the track record needed to go remote-first.
It's also nothing new; new grads gravitated towards these hubs anyway. Previously, they would settle down in the 'burbs. Now they're migrating anywhere in the US.
1. You start off by labelling both platforms as "extreme partisan" - care to explain?
2. This charge is used to minimize the original complaint (login requirement), which is a hard blocker to view replies, i.e. additional context.
3. This all then somehow morphs into a point about platform longevity?
How exactly does any of this address parent commenter's statement that "bsky is just a superior viewing experience."?
Too late to edit / delete my post but I retract it and apologize.
I get a rejection for every single position that I apply to. Often I read literally a description of myself in the job posting, and it's still "we decided to not move forward".
At the same time, I hire people and see that 8/10 candidates are just trash. Not in the sense they "are not aligned", or "emit wrong vibes", or other bs. They literally can't write a single line of code, on their own laptop, in their own IDE.
Make it make sense.
One low point was an interview with a guy who connected with his current work laptop and couldn’t find the $ key on it for a basic scripting question.
I can’t make it make sense either.