I'm at the opposite end. I feel AI is sucking all the joy out of the profession. Might pivot away and perhaps live a simpler life. Only problem is that I really need the paycheck :(
Yup. I worked very hard, and for many years, to acquire a skill in designing and writing systems. It is an art. And it is very disheartening to see people without any skills behave the way they do. For now, the work I do cannot be replicated by these people, but I do not have such high hopes for the distant future. Though by the point it can truly be automated, I think it will be automating a large majority of non-physical jobs (and those will likely be getting automated by then too).
I feel it's nice to use AI coding for side projects, especially after work when I am kind of tired. The one issue is that if it gets stuck in a loop, or just does not get what is wrong and does the wrong thing no matter how you twist it, then you have to go into the weeds to fix it yourself, and that feels so tiresome. At that point I think: what if I had just done everything myself, so my mental model would be better?
Also, we are still designing systems and have to be able to define the problem properly. At least in my company, when we look at the velocity in delivering projects, it is barely up since AI because the bottlenecks are elsewhere.
I think what we currently have is pretty close to the ceiling for LLMs. But with the amount of money being spent, there might be a new breakthrough (not LLM-based).
Vibecoder here. I don't think so. I am a PE investor, and we are using it in our small portfolio companies to great effect. We can make little mini-apps that do one thing right and help automate away extra work.
It's a miracle. Simply wouldn't have been done before. I think we'll see an explosion of software in small and midsize companies.
I admit it may be crappy software, but as long as the scope is small - who cares? It certainly is better than the janky manual paper processes, excel sheets, or just stuff in someone's head!
I think the parent is talking about the people who post to LinkedIn that "SWE as a profession is dead" non-stop. I fully agree with you that it massively lowered the cost to create, but I'd argue that the people saying SWE is dead wouldn't be able to get past the complexity barrier that most of us are accustomed to handling. I think the real winners will be the ones who have domain expertise but didn't have the capacity to code (just like OP and you).
Correct. I think "real" software requires real development and architecture.
And to be honest, even the tiny apps I'm doing I wouldn't have been able to do without some background in how frontend / backend should work, what a relational database is, etc. (I was an unskilled technical PM in the dotcom boom in the 2000s, so I at least know my way around a database a little. I know what these parts of tech CAN do, but I didn't have the skills to make them do it myself.)
Curious why the janky manual paper processes, excel sheets, or stuff not documented were fixed only when vibe coding was available. Was it just cost?
Time, and thus cost. Early in my career I would look across a fairly large company at processes being run on spreadsheets and assess whether it would be worth the time to create software to address them, and whether those processes should be standardized. We barely scratched the surface of all the possible custom software opportunities for this company.
Cost and managerial overhead. We don't have a dev on staff. Even if we did, there is lots of managerial overhead to explain "the problem" and then iterate to a solution with a dev. Now you can just build the damn solution yourself!
1. Invoice billing review. Automated 80% of what was a manual process by providing AI suggestions in an automated way. Saved 3 hours per day of manager time. Increased topline by 10%. Dev time: 1 day.
2. Data dashboards. We use a janky SaaS that does not have APIs. Automated a scraper to log in, download the reports daily, parse and upload them to a database, and build a dashboard (a rough sketch of the pipeline is below). Used to take my associate 3 hours per week to do this in a crappy spreadsheet. Now I have it in a perfect database much more frequently. Dev time: 4 hours.
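Roughly, that pipeline looks like the sketch below; every URL, field name, and credential here is a placeholder, not our actual setup:

```python
# Sketch only: log in to a hypothetical SaaS, pull a CSV report,
# and load it into a local SQLite table for dashboarding.
import csv
import io
import sqlite3

import requests

BASE = "https://example-saas.invalid"            # placeholder URL
LOGIN = {"username": "me", "password": "..."}    # placeholder creds

def fetch_report() -> str:
    with requests.Session() as s:
        s.post(f"{BASE}/login", data=LOGIN, timeout=30)
        r = s.get(f"{BASE}/reports/daily.csv", timeout=30)
        r.raise_for_status()
        return r.text

def load_into_db(csv_text: str, db_path: str = "reports.db") -> None:
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS daily (day TEXT, metric REAL)")
    con.executemany(
        "INSERT INTO daily VALUES (:day, :metric)",
        [{"day": r["date"], "metric": float(r["value"])} for r in rows],
    )
    con.commit()
    con.close()

if __name__ == "__main__":
    load_into_db(fetch_report())   # run daily via cron or a scheduler
```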
We are attacking little problems all across the business now.
yep, how do we define AI as a replacement for a search engine, a templating engine, and an inference engine (do X in Y)?
is there a term for that?
AI at our fingertips, accessible and useful, that's just a tool, that's not redefining us as an industry and denying people's jobs – that's an asset. (I used an em dash to prove I am not AI, as apparently double dash is now a sign of AI text!)
Agree 100%; and the analogy with SEO is spot on! Those were everywhere 20 years ago. They're mostly gone, and so are their secret recipes and special tags and whatnot. AI gurus are the same! Not the same people but the same profile. It's so obvious.
"Comment NEAT to receive the link, and don't forget to connect so I can email you" -- this is the most infuriating line ever.
I'd recommend a pivot to hardware. I'm in the FPGA sector, and vibe coding isn't a thing for the most part, simply because the determinism required doesn't lend itself well to LLMs. It's so incredibly easy to introduce a bug at every single step, and the margin for error depending on volumes is near zero. You're often playing with a single clock cycle of headroom. I've yet to play with a single LLM (Claude Opus 4.5 is my latest trial) that doesn't introduce a massive amount of timing errors. Most semiconductor IP is proprietary, top-level secret, code never leaves the building. The data to build good models just isn't there like it is for software and the open-source ecosystem.
In comms, they have something like a 1:4 ratio of design to validation engineers. Defence is slightly different, as it depends on the company, but generally the tolerance for bugs is zero. Let's not get started on the HF trading folks and their risk appetite!
There's a lot of room for software engineers. Most FPGAs are SoC devices now, running some form of embedded Linux doing high-level task management and networking. Provided you know enough Verilog to know your way around, you'll be fine. You're also in a space where most engineers I know are preparing to retire in the next 5-10 years, so there will be a panic which will ripple across industries.
I don't get this sentiment. Regressions still exist, you can't just prompt them away, and a programmer will spend 10x more time on regressions, bug fixes, and improvements than on scaffolding in most projects that people pay for. If most of your time at work is not spent doing this, then you are already living a simple life.
Consider security engineering. It requires constantly thinking about unconventional ways to attack systems, and taking advantage of the common coding mistakes that LLMs produce as often as humans do, because they learned from humans.
Security engineers will have jobs until software is perfectly secure... and that is going to be a while.
I do not use LLMs at all to do my job, and it is unlikely I ever would. Clients pay me -after- they had all their favorite LLMs take a pass.
I feel the same way. The only way I found that lets me cope with this is by having 1-2 personal projects, closed source, with me as the only user, where I slowly build things the way I enjoy, and where the outcome is useful software that doesn't try to monetise at the expense of the end user.
I quit my job over AI. It just felt like my job was approving pull requests where both the PR and the code itself were just slop. In all fairness, it was mainly CRUD applications, so not a big deal, but in the end I didn't feel like I had any control over the application anymore, with hundreds of lines of slop being added every day.
One day I might start a consultancy business that only does artisanal code. You can hire me and my future apprentices to replace AI code with handcrafted code. I will use my company to teach the younger generation how to write code without AI tooling.
That's an interesting perspective. I guess it depends on what you want and how low the stakes are. Artisanal coffee, sure. Artisanal clothing, why not? Would you want an artisanal MRI machine? Not sure. I wouldn't really want it "hand crafted", I just want it to do its job.
yup. the things i disliked most about programming were hyped up bullshit and losing autonomy.
These existed before but the culture surrounding AI delivered a double dose of both.
I have no problems with LLMs themselves, or even how they are used, but the field has developed its own religion filled with dogma, faith-based reasoning, and priests, which is utterly toxic.
The tools are shoved down our throats (thanks to the priesthood, AI use is now a job performance criterion), and when they fail we are not met with curiosity and a desire to understand but with hostility and gaslighting.
Happy for everyone who enjoys it. For me it's the opposite: AI everywhere sucks the joy out of it and I'm seriously starting to consider a career shift after roughly 10 years of writing code for a living.
I feel you. There's a massive difference between crafting and assembling. AI turns us from artisans carving a detail into assembly line operators. If your joy came from solving algorithmic puzzles and optimizing loops, then yes, AI kills that.
It might be worth looking into low-level dev (embedded, kernel, drivers) or complex R&D. Vibe coding doesn't work there yet, and the cost of error is too high for hallucinations. Real manual craftsmanship is still required there.
The cost of hallucinations, though - you potentially have a stronger point there. It wouldn't surprise me if that fails to sway some decision makers, but it does give the average dev a bit more ground to work with.
I'm starting to think that people don't want to be programmers anymore, they want to be managers who delegate their work to someone or something else, and then come back, critique the work, and do another loop
I'm thinking back to my contracting days when a typical customer might have a team of ten people but only one or two did the bulk of the work. Now the whole team can be productive for whatever measure you use for productivity.
It's not so great for the one or two but fantastic for everybody else.
I guess I'm an outlier then because I actually like programming, and I've never wanted to be a manager, even a manager of an LLM. At least half the fun of making software is doing the programming
It sucks the joy out of it because to the extent that you build something with AI, (Obama voice) you didn't build that. I am allergic to the concept of developing with AI, especially for personal work, because AI-authored code isn't something I built, it's something I commissioned. It's like if I went onto Fiverr or Upwork with a spec and paid money and said "Here, build this" to a freelancer and then went back and forth with that person to correct and refine the result. I might get a halfway decent result in the end, but I don't get the experience of solving the problem myself. Experience solving problems yields new insights. It's why math textbooks have exercises: the only way to grasp the concepts is to solve problems with them.
With AI, you are no longer a developer, you're a product manager, analyst, or architect. What's neat about this, from a business perspective, is that you can in effect cut out all your developers and have a far smaller development workforce consisting of only product managers, analysts, and architects whom you call "developers" and pay developer salaries to. So you save money twice: once on dev workforce downsizing, and again on the pay grade demotion.
The problems I've been working on are at a much higher level than the nuts and bolts.
I'm currently exploring domain-specific languages aimed at writing web applications. I've been particularly interested in data flowing through pipelines, much like bash (a toy sketch of the idea is below). I have spent quite a bit of time on this, and I'm definitely not vibe coding, but I've probably only written 1-2% of the code in these projects.
It is so much work to build out a new language with a surrounding ecosystem of tooling. Not even five years ago this would necessarily have been a full-time, multi-year endeavor, or at least required a team of researchers. Now I can tinker away in my off hours.
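To illustrate only the "data flowing through pipelines" idea (a toy sketch of my own in Python, not the actual language or its syntax):

```python
# Toy illustration of pipeline-style composition, loosely in the spirit
# of shell pipes. Purely hypothetical; not the DSL discussed above.
from functools import reduce
from typing import Callable, Iterable

Step = Callable[[Iterable], Iterable]

def pipeline(*steps: Step) -> Step:
    """Compose steps so data flows left to right, like `a | b | c`."""
    return lambda data: reduce(lambda acc, step: step(acc), steps, data)

# Example: rows -> keep active -> project names -> render a list
rows = [{"name": "Ada", "active": True}, {"name": "Bob", "active": False}]
render_active_names = pipeline(
    lambda rs: (r for r in rs if r["active"]),
    lambda rs: (r["name"] for r in rs),
    lambda names: "\n".join(f"<li>{n}</li>" for n in names),
)
print(render_active_names(rows))   # -> <li>Ada</li>
```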
No need to go that far. I bounced off weekend projects many times because I lost interest the moment I had to relive fighting the "modern" frontend ecosystem setup (or whatever else unrelated to the actual building), which is what I was already doing at the day job. In the end I just gave up, because I'd rather get some rest and fun out of my time off. Now I can just skip that part entirely instead of tanning in front of <insert_webpack_or_equivalent> errors for hours on a Saturday afternoon.
Huh? What about all the open source software you use, did you build all of it?
What about the phone in your hand, did you design that?
HN loves to believe they are the noble few - men and women of math and science, driven by nothing but the pure joy of their craft
But this whole AI thing has been super revealing. Almost everyone here is just the same old same old, only now that the change is hitting close to home, you’re clutching your pearls and lamenting the days when devs were devs.
The younger generation born into the AI world is going to leave you in the dust because they aren’t scared of it
My math teacher used to say that people felt this way about… calculators. Imagine that.
There seem to be two camps of people: those who love the coding and those who love delivering value/solutions. I am in the latter camp. The happy consumer and the polished product are what give me satisfaction; the code is really just a vehicle from A to B. It’s a shame for anyone in the first camp who wants a career.
This is such marketing speak. The words mean nothing, they’re just a vague amalgamation of feelings. “Vibes”, if you will.
If you “love delivering value and solutions”, go donate and volunteer at a food bank, there’s no need for code at any point.
> The happy consumer and the polished product
More marketing speak. If you are using LLMs to write your code, by definition your product isn’t “polished”. Polishing means poring over every detail with care to ensure perfection. Letting an LLM spit out code you just accept is not it.
The word you’re looking for is “shiny”, meaning that it looks good at a glance but may or may not be worth anything.
What term would you use? You can't say "a finished product" because it may never be finished, but something that other people find valuable seems like a good definition.
I get the argument. Sometimes I really enjoyed the actual act of finally figuring out a way to solve a problem in code, but most of the time it was a means to an end, and I'm achieving that end far more often now via AI tooling.
I’m not fussed about the exact term, as long as it points to something real and sits on equal semantic footing with the alternative.
Note how they described two areas of focus (what you “love”): “coding” and “delivering value/solutions”.
You can be a “coder” or a “programmer”, no one is a “deliverer of value/solutions”.
“Coding” is explicit, it’s an activity you can point at. “Delivering value/solutions” is vague; it’s corporate speak to sound positive without committing to anything. It doesn’t represent anything specific or tangible. It doesn’t even reference software, though that’s what it is, to make it sound broader than it is. You could say “using and releasing apps”, for example, though proponents may feel that’s reductive (but then again, so is “coding”).
Again, what’s in contention here isn’t the exact term, but making sure it’s one that actually means something to humans, instead of marketing speak.
Nonsense. Features are requested from me, I deliver them to the customer, the customer is happy and pays me. I deliver solutions and the customer deems them valuable for their business... What else am I supposed to call that?
I'm extremely diligent about vetting all code in my repos. Everything is thoroughly tested and follows the same standards that were in my codebase before the invention of LLMs. I'm not "vibe coding". You're making assumptions because of your negative emotional reaction to LLMs.
> This is such marketing speak. The words mean nothing, they’re just a vague amalgamation of feelings. “Vibes”, if you will.
I actually think this reveals more about you than you might realise. A _lot_ of people enjoy being able to help people resolve problems with their skills. “Delivering value” may be marketing speak, but what it points at is specifically helping people in ways they find valuable.
A lot of people who work in software are internally motivated by this. The act of producing code may (or may not) also be enjoyable, but the ultimate internal motivation is to hand over something that helps others (and the external motivation is obviously dollars and cents).
There is also a subset of people who enjoy the process of writing code for its own sake, but it's a minority of developers (and dropping all the time as tooling - including LLMs - opens development to more people).
> If you are using LLMs to write your code, by definition your product isn’t “polished”. Polishing means poring over every detail with care to ensure perfection.
You can say the same thing about libraries, interpreters, OSes, compilers, microcode, assembly. If you're not flipping bits directly in CPU registers, you're not poring over every little detail to ensure perfection. The only difference between you and the vibe coder who's never written a single LoC is the level of abstraction you're working at.
Edit:
> If you “love delivering value and solutions”, go donate and volunteer at a food bank, there’s no need for code at any point.
I think this maybe says a lot about you too, as many people donate their time and efforts to others. I think it may be worth some self-reflection to see whether your cynicism has become nihilism.
I have spent over a decade working primarily on open source, for free. I still do it, though it’s no longer my primary activity. A huge chunk of that time was helping and tutoring people. That I still do, and I’m better at it; I still regularly get thank-you messages from people I assisted or who use the tools I build.
I did use to volunteer at a food bank, but I used that example only because it’s quick and simple, no shade on anyone who doesn’t. I stopped for logistical reasons when COVID hit.
I have used the set of skills I’m good at to help several people with their goals (most were friends, some were acquaintances) who later told me I changed their life for the better. A few I no longer speak to, and that’s OK.
Oh, and before I became a developer, I worked in an area which was very close to marketing. Which was the reason I stopped.
So yeah, I know pretty well what I’m talking about. Helping others is an explicit goal of mine that I derive satisfaction from. I’d never describe it as “delivering value/solutions” and neither would any of the people I ever helped, because that’s vague corporate soulless speech.
I don’t think they should’ve done that or continue to do it without consent, and I don’t limit that to code. Books, images, everything else applies the same.
I also don’t think “but it wouldn’t be viable otherwise” is a valid defence.
I don’t see what that has to do with the conversation, though. If your point is about the free/$20, that doesn’t really factor into my answer.
It’s not marketing speak, but it’s rarely 100 percent one or the other.
> More marketing speak. If you are using LLMs to write your code, by definition your product isn’t “polished”.
This doesn’t make any sense. Polished to who? The end user? You can absolutely use AI to polish the user experience. Whether coding by hand or AI the most important aspect of polish is having someone who cares.
> ...and those who love delivering value/solutions. I am in the latter camp. The happy consumer and the polished product is what gives me satisfaction...
Can't the customer now just skip you and generate a product for himself via AI?
Serious? Have you used an LLM? Of course they couldn't... LLMs speed up my development velocity. Maybe 1.5x-2x? Hard to measure. You still need the knowledge to make smart decisions, enforce sensible/maintainable architecture & patterns, etc. How is a regular person going to review code to make sure it's correct/efficient/safe?
Agree with those 2 camps. The latter camp is all cheered up, which is nice, but they should be asking whether their solution is valuable enough to be maintained. If so, you should make all generated code your code, exactly in the form it needs to be according to your deep expertise. If not, congratulations, you have invented throw-away code. Code of conduct: don't throw this code at people from the former camp.
Or to phrase it more succinctly: if you are in camp 2 but don't have the passion of camp 1, you are a threat for the long term. The reverse is dangerous too, but can be offset to a certain extent with good product management.
> If so, you should make all generated code your code, exactly in the form it needs to be according to your deep expertise.
This is a solved problem with any large, existing, older code base. The original writers are gone and new people come on all the time. AI has actually helped me get up to speed in new code bases.
> If so, you should make all generated code your code, exactly in the form it needs to be according to your deep expertise.
Is this also true of all third-party code used by their solution? Should they make all libraries and APIs they use their own, exactly in the form they need to be according to their deep expertise? If not, why not?
If so, does this extend to the rest of the stack? Interpreters, OSes, drivers? If not, why not?
Well, what if one becomes unmaintained or has issues that only affect your project? Why is that uncontrolled code different to generated code? Is it specifically that it's generated?
This isn't a trick question, BTW. It's a genuine attempt to get to the rationale behind your (and the GP's) stance on this.
In particular, the GP said:
> Or to phrase it more succinctly: if you are in camp 2 but don't have the passion of camp 1, you are a threat for the long term.
That hints, I think, at their rationale: that their stance is based on placing importance on the parts of software development that they enjoy, rather than on any logical basis.
> Well, what if one becomes unmaintained or has issues that only affect your project.
This happens, but very rarely compared to changes in your own code base. If a library breaks, you can usually find an alternative, but even in that case you need to know how to modify your own code.
The difference with generated code is that you are tasked to maintain the generated code.
I think for a lot of minor things, having AI generate stuff is okay, but it’s rather astounding how verbose and sometimes bizarre the code is. It mostly works, but it can be hard to read. What I’m reading from a lot of people is that they’re enjoying coding again because they don’t have to deal with the stuff they don’t want to do, which... I mean, that’s just it, isn’t it? Everyone wants to work on what they enjoy, but that’s not how most things work.
Another problem is that if you just let the AI do a lot of the foundational stuff and only focus on the parts that you’re interested in, you sometimes just miss giant pieces of important context. I’ve tried reading AI-driven code; sometimes it makes sense, sometimes it’s just unextendable nonsense that superficially works.
This isn’t tech that should replace anything, and it needs to be monitored judiciously. It can have value, but what I suspect is going to happen is that we are going to have a field day with people fixing and dealing with ridiculous security holes for the next decade after this irrational exuberance goes away. It should be used in the same way that any other ML technique should be: judiciously and in a specific use case.
Said another way, if these models are the future of general programming, where are the apps already? We’re years into this and where are they? We have no actual case studies, just a bunch of marketing copy and personal anecdotes. I went hunting for some business case studies a while ago and I found a Deloitte “case study” which was just pages of “AI may help” without any actual concrete cases. Where are the actual academic studies showing that this works?
People claiming AI makes them code faster reminds me that Apple years ago demonstrated in multiple human interaction studies that the mouse is faster, but test subjects all thought keyboard shortcuts were faster [1]. Sometimes objective data doesn’t matter, but it’s amusing that the whole pitch for agentic AI is that it is faster and evidence is murky for this at best.
If you really want to deliver polished products, you still have to manually review the code. When I tried actually "vibecoding" something, I got exhausted so fast by trying to keep up with the metric tons of code output by the AI. I think most developers agree that reviewing other people's code is more exhausting mentally than writing your own. So I doubt those who see coding as too mentally straining will take the time to fully review AI written code.
More likely that step is just skipped and replaced with thoughts and prayers.
I do manually review. I don't think the quality of my output has reduced even slightly. I'm just able to do much more. I deliver features more quickly, and I'm making more money, so of course I'm happy. If there was no money in programming I wouldn't be doing it, I think that's the major distinction. I barely have any understanding of how a CPU works, I don't care. I build stuff and people are very happy with what I build and pay me money for it...
This false dichotomy comes up from time to time, that you either like dicking around with code in your basement or you like being a big boy with your business pants on delivering the world's 8000th online PDF tools site. It's tired. Please let it die.
There are people who would code whether it was their career or not; I'm not one of those people. I fell into software development in order to make money, and if the money stopped then I would stop. I love building and selling products, and if I can't do that then I have no interest in programming. I'm not interested in machines, CPUs, etc. I'm interested in products, liaising with customers, delivering solutions, improving things for users, etc. You think there is no distinction there? Again, there are people who code for fun, I'm simply not one of them...
I like using my software engineering skills to solve people's problems. I don't do coding for its own sake - there's always a thing I'm trying to implement for someone.
As a professional, your job is to deliver value and solutions. It used to be that you could do this by writing code. AI changes this calculus because if the machine can write the code instead, the value you deliver by writing it yourself is greatly diminished.
I've also noticed a kind of grouping like this. I've described them as the "Builders" and the "Solvers", where the former enjoys the construction aspect of the code more, and the latter enjoys the problem/puzzle-solving aspect more. I guess it's more of a scale than a binary, since everyone's got a bit of both, but I think I agree that AI is more fun for the builders.
Same here. Farmer now, former network engineer and software project lead, but I stopped programming almost 20 years ago.
Now I build all sorts of apps for my farm and the organizations I volunteer for. I can pound out an app for tracking sample locations for our forage association's soil sample truck, another for moisture monitoring, or a fleet task/calendar/maintenance app in hours, and iterate on them when I think of features.
And git was brand new when I left the industry, so I only started using it recently to any extent, and holy hell, is it ever awesome!
I'm finally able to build all the ideas I come up with when I'm sitting in a tractor and the GPS is steering.
Seriously exciting. I have a hard time getting enough sleep because I hammer away on new ideas I can't tear myself away from.
100% this. This is the new age of software - but it will be tiny little apps like this for each little user. They don't need to be mega apps, etc. Bespoke little apps that help your own little business or corner of the world.
I'm teaching my kid what I consider the AI dev stack: AI IDE (Antigravity for us), database (Supabase for us with a nice MCP server), and deployment (Github and Vercel for us). You can make wonderful little integrated apps with this in hours.
Love to hear about what tech is like on farms today. Do you run into problems with fixing tractors and equipment where it's all locked down with DRM and you can't fix it without hacking the software?
Slightly moving into the other direction, after 17 years of science and tech optimism I see myself turning into a Luddite more and more.
My first observation was that the communication and social aspects of software seem crucial for success and proliferation.
And next came the realization that technology seems unable to solve any socio-economic problems, but rather aggravates them.
I, on the other hand, am getting gradually, but strongly, disillusioned with, and importantly also disenfranchised from, coding and the world around it.
I never stopped developing but I find myself taking on a lot more side projects than I used to. The cost for doing those just dropped significantly. This enables me to prototype and pursue things that I previously wouldn't have.
I'm also now dealing with things that previously would have taken me too long to deal with. For example, I'm actually making a dent in the amount of technical debt I have to deal with. The type of things where previously I maybe wouldn't have taken a week out of my schedule to deal with something that was annoying me. A lot of tedious things that would take me hours/days now can get done in a few prompts. With my bigger projects, I still do most stuff manually. But that's probably going to change over the next months/year.
I'm mainly using Codex. I know a lot of people seem to prefer Claude Code, but I've been a happy ChatGPT Plus user for a while, and Codex is included with that and seems to do the job. Amazing value for $20/month. I've had to buy extra credit once now.
The flip side of all this is that waiting for AI to do its thing isn't fun. It's slow enough that it slows me down and fast enough that I can't really multitask. It's like dealing with a very slow build that you have to run over and over again. A necessary evil, but not necessarily fun. I can see why a lot of developers feel like the joy is being sucked out of their lives.
Dealing with this pain is urgent. Part of that is investing in robust and fast builds. Build time competes with model inference in the time stuff takes. And another part is working on the UX of this. Being able to fork multiple tasks at once is hugely empowering. And switching between editing code and generating code needs to get more seamless. It feels too much like I'm sitting on my hands sometimes.
Creating a polished, usable app is just so much work, and so much of it isn't fun at all (to me). There are a few key parts that are fun, but building an intuitive UI, logging, error handling, documentation, packaging, versioning, containerization, etc. is so tedious.
I'm bewildered when I read posts by the naysayers, because I'm sitting here building polished apps in a fraction of the time, and they work. At least much better than what I was able to build over a couple of weekends. They provide real value to me. And I'm still having fun building them.
I've now vibe coded three apps, two of them web apps, in Rust, and I couldn't write a "Hello World" in Rust if you held a gun to my head. They look beautiful, are snappy, and it being Rust gives me a lot of confidence in their correctness (feel free to disagree here).
Of course I wouldn't vibe code in a serious production project, but I'd still use an AI agent, except I'd make sure I understand every line it puts out.
I can understand you don't want to spend effort for throwaway code.
> in a serious production project, but I'd still use an AI agent, except I'd make sure I understand every line it puts out.
That isn't going to cut it. You need to understand the problem domain, have deep design taste to weigh current and future demands, form a conceptually coherent solution, formalize it into code, then feed back from the beginning. There is no prompt giving your AI those capabilities. You end up with mediocre solutions if you settle for understanding every line it spits out. To be fair, many programmers don't have those capabilities either, so it is also a question of quality expectations.
I believe you can use LLMs as advanced search and as a generator for boilerplate. People who like it easy also tend to be easy on quality attributes, so everyone should be self-aware about where they sit on that spectrum.
> Creating a polished, usable app is just so much work, and so much of it isn't fun at all (to me).
Then don’t do it. No one is forcing you. Are you also going to complain that building airplanes and ensuring food safety are too much work and not fun for you? Not everything needs to be or should be dumbed down to appeal to lowest common denominator.
Alternatively, go work at a company where you’re part of a team and other people do what you do not enjoy.
> I'm sitting here building polished apps in a fraction of the time
No, no you are not, guaranteed. “Polishing” means caring about every detail to make it perfect. If you’re letting the LLM make most of it, by definition it’s not polished.
Also, no one is keeping me from doing whatever I want to spend my time on during my days off.
> Are you also going to complain that building airplanes and ensuring food safety are too much work and not fun for you?
No, because this isn't remotely comparable to weekend hobby projects. What a weird question.
> No, no you are not, guaranteed. “Polishing” means caring about every detail to make it perfect. If you’re letting the LLM make most of it, by definition it’s not polished.
I guess we have different definitions of "polished" then.
> No, because this isn't remotely comparable to weekend hobby projects.
I agree. But those also don’t need:
> intuitive UI, logging, error handling, documentation, packaging, versioning, containerization, etc. is so tedious.
Some of that, sure, but not all of it. Either it’s a weekend hobby project or it’s not, and your description is conflating both. A hobby is something done for fun.
He said fun, not easy. Sometimes it's precisely doing brainless stuff over and over again that becomes hard, like writing a template displaying a table of your results or implementing filtering and pagination on a web app. I don't feel like I'm growing anymore when doing those things. The same goes for some tests, or when you need a Bash script automating menial stuff. (Still, you could find a new perspective on things.)
> Sometimes it's precisely doing brainless stuff over and over again that becomes hard, like writing a template displaying a table of your results or implementing filter and pagination on a web app.
I always have a hard time taking this complaint seriously, because the solution is absolutely trivial. Write a snippet. Have you really been out there, year after year, rewriting the same shit from scratch over and over? Just make a snippet. Make it good and generic and save it. Whenever you need to do something repeated on a new project, copy it (or auto-expand if you use it that often) and adapt. Snippet managers are a thing.
Or better yet, refactor your app so it doesn't require so much boilerplate - surely if you're doing the same thing over and over again you can just extract it into its own function / method and abstract over it.
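For instance, a tiny framework-agnostic helper along these lines (hypothetical names, just a sketch of the idea) covers most of the repeated filter/pagination code:

```python
# Sketch: one generic filter + pagination helper instead of repeating
# the same boilerplate in every list endpoint. Names are hypothetical.
from typing import Callable, Optional, Sequence, TypeVar

T = TypeVar("T")

def paginate(items: Sequence[T],
             page: int = 1,
             per_page: int = 20,
             predicate: Optional[Callable[[T], bool]] = None) -> dict:
    filtered = [x for x in items if predicate is None or predicate(x)]
    start = (page - 1) * per_page
    return {
        "items": filtered[start:start + per_page],
        "page": page,
        "total": len(filtered),
    }

# Each list endpoint then reduces to something like:
# paginate(load_users(), page=2, predicate=lambda u: u.active)
```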
> Of course I wouldn't vibe code in a serious production project, but I'd still use an AI agent, except I'd make sure I understand every line it puts out.
So you value your ability to churn out insignificant dreck over the ability of others to use the internet? Because that's the choice you're making. All of the sites that churn your browser for a few seconds because they're trying to block AI DDoS bots: is that worth your convenience on meaningless projects? The increased blast radius of Cloudflare outages: is that a cost worth foisting onto the rest of the internet for your convenience?
This is such a... unique angle. Of all the things to get angry at AI for, web crawlers and the impact on Cloudflare outages are the ones that really grind your gears?
For myself, I’ve always enjoyed “getting my hands dirty” with code, and the advent of LLMs have been a boon. I’m retired from 34 years of coding (and managing), and never skipped a beat. I’ve released a few apps, since retiring. I’m currently working on the first app that incorporates a significant amount of LLM assistance. It’s a backend admin tool, but I’ll probably consider using the same methodology for more public-facing stuff, in the future.
I am not one to just let an LLM write a whole app or server, unsupervised (I have control issues), but have allowed them to write whole functions, and help me to find the causes of bugs.
What LLMs have given me, is a decreased hesitance to trying new things. I’ve been learning new stuff at a furious rate. My experience makes learning very fast. Having a place to ask questions, and get [mostly] good answers (experience helps me to evaluate the answers), is a game-changer.
> “A ship in harbor is safe, but that is not what ships are built for.”
–John A. Shedd
This is more than just a bad side project - it's borderline malicious.
How confident is the OP that every single one of these 60 calculators works all the time, with all edge cases? Because if someone is on your website using your calculator, they are putting trust in you. If it's wrong, it could have downstream impacts on them. I hope every single one has a comprehensive set of tests with good edge cases. But realistically, will they?
I'm actually pretty pro-AI development. But if you're going to use AI to help develop a website, at least focus on quality rather than quantity. AI makes quantity easy, but quality is still hard.
As an aside, the website doesn't even work for me. My clicks don't do anything.
This. I have so many things to say about the site, but have been withholding them in fear of "posting shallow dismissals, especially of other people's work"
Same here. I’m an AI professor, but every time I wanted to try out an idea in my very limited time, I’d spend it all setting things up rather than focusing on the research. It has enabled me to do my own research again rather than relying solely on PhD students. I’ve been able to unblock my students and pursue my own projects, whereas before there were not enough hours in the day.
I'm not a bot. I'm not a native English speaker; I taught myself English. So I tried to use AI to translate what I really want to say. (These words are typed by myself instead of AI.)
If that’s the case, then mentioning using LLMs to help translate/organise what you want to say in your messages might be taken a bit better by others.
If you want to use LLMs to help express something you don’t know the words for in English then that is a good use for LLMs, if it’s called out. Otherwise your messages scream LLM bot to native speakers.
“You’re absolutely right”, “That hits different”, “Good call!” “–“ are all classic LLM giveaways.
I’m not a moderator here, so you don’t have to listen to me either way.
By contrast, the moment I am no longer able to compete with AI users, is the moment I quit the industry. I have no interest in outsourcing my thinking.
Thankfully LLMs are still very stupid. Especially when it comes to security engineering, my specialty, so looks like I have a while yet.
The key phrase here is "I still had domain expertise". Many miss that AI is a multiplier. If you multiply 0 by AI, you get 0 (or hallucinated garbage). You multiplied your knowledge of compound interest and UX by AI's speed.
Without your background, the AI would have generated a beautiful interface that calculates mortgages using a savings account formula. Your role shifted from "code writer" to "logic validator" - this is the future of development for domain specialists
Similar path here - studied physics, worked in accounting/finance for years, hadn't shipped code in forever. The thing that clicked for me wasn't the AI itself but realising my domain knowledge had actually been compounding the whole time I wasn't coding.
The years "away" gave me an unusually clear picture of what problems actually need solving vs what's technically interesting to build. Most devs early in their careers build solutions looking for problems. Coming back after working in a specific domain, I had the opposite - years of watching people struggle with the same friction points, knowing exactly what the output needed to look like.
What I'd add to the "two camps" discussion below: I think there's a third camp that's been locked out until now. People who understand problems deeply but couldn't justify the time investment to become fluent enough to ship. Domain experts who'd be great product people if they could prototype. AI tools lower the floor enough that this group can participate again.
The $100 spent on Opus to build 60 calculators is genuinely good ROI compared to what that would have cost in dev hours, even for someone proficient. That's not about AI replacing developers - it's about unlocking latent capability in people who already understand the problem space.
The Turing Test is not really science (an infallible test with a measurable outcome). An AI might never be able to pass the TT for all humans; it just gets to be a higher-def AI. That makes the TT a technology.
It's a shame to find an AI-written ad so highly upvoted here.
The author even insists that AI was used because of their poor English, which is the standard excuse on Reddit as well. But clearly, this is not a translation:
> Curious if others have similar stories. Anyone else come back to building after stepping away?
This is bog-standard AI slop to increase engagement.
Look at the blog on their linked site as well. AI-generated posts.
This has been posted here for SEO. This is a business venture.
It's times like this when I think HN needs a post downvote button. Flagging might not be quite appropriate here, but I hate to see this content cluttering up the front page.
You improve over time. I've been programming for 6 years and I still feel like I'm nowhere near others. That's a completely fine and valid thing to feel.
I’m glad to see people finding coding accessible again. To me this kind of common “AI made coding fun and accessible again” message signals something deeper. As a field, we allowed our systems to get so complex that we lost people, and AI tools are bringing them back. Maybe we should look at how we have chosen to design systems and ask “can these be made simpler and more accessible?” Even before AI systems I looked at my field with sadness: there is complexity growing everywhere and few people looking to address that. Instead, we seem to have incentivized creating complexity, because new complicated systems that are hard to use lead to career advancement if you can point at something and say “I am one of the few who can deal with that” or “I created that complex thing”. The ability to handle the complexity makes an individual valuable even though the effect is that it excludes many others.
Perhaps if we didn’t have deep layer cakes of frameworks and libraries, people would feel like they can code with or without AI. Feels like AI is going to hinder any efforts to address complexity and justify us living with unnecessary complexity simply because a machine can write the complex, hard to understand, brittle code for us.
One thing that’s always missing from these compound interest calculators is multiple assets with different rates, and different rates over time (e.g between X date and Y date use Z rate, etc). I didn’t quite figure out the right UI for the second one.
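In case it helps anyone building the same feature: the "different rates over time" part is mostly piecewise compounding over date ranges. A minimal sketch (my own illustration with made-up names, not the site's code):

```python
# Sketch: one asset, annually-stated rates, a different rate per
# date range, compounded annually (with fractional years) per period.
from dataclasses import dataclass
from datetime import date

@dataclass
class RatePeriod:
    start: date
    end: date
    annual_rate: float   # e.g. 0.05 for 5%

def grow(principal: float, periods: list[RatePeriod]) -> float:
    """Apply each period's rate for the fraction of a year it covers."""
    value = principal
    for p in periods:
        years = (p.end - p.start).days / 365.25
        value *= (1 + p.annual_rate) ** years
    return value

# Example: 4% until 2020, then 7% until 2025
print(grow(10_000, [
    RatePeriod(date(2015, 1, 1), date(2020, 1, 1), 0.04),
    RatePeriod(date(2020, 1, 1), date(2025, 1, 1), 0.07),
]))
```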
Yeah enjoying it too, though it’s a different type of joy than hand rolling it. More getting things done fast which is neat but less proud of what one crafted
Can definitely understand the reluctance people feel around it. Especially when they’ve invested years into it and have their livelihood on the line
I’m also quite reluctant to publish any of it. Doesn’t feel right to push code I don’t fully understand so mostly personal projects for now
I've lost the joy in programming, the only thing I'm good at. I now make horrible music, but at least I don't exist as the means to an end that I don't control.
No offense taken, but what's the point in using AI for anything unless you don't want to do it yourself? I want to live my life, not consume information. Is that really so bad?
Happy compounding! Wish I had started younger, but I'm catching up. 25% of your salary into a pension in global index funds is, I think, the way. You never get to touch it, there are no decisions to make, and you just forget it. Live life. Have a lot of money later. (Maybe go down to 5% when needed, e.g. buying a house or having a baby.)
I use AI as a senior developer I ask questions to. It gives me an answer, which I can use on my work or not. Saved me days of work, but I couldn't be taken out (yet) of the loop because I'm still making the decisions...
I don’t like AI for production code, but I love it for ideation and prototyping. I agree. It really allows you to quickly iterate on ideas without being blocked by implementation details.
Not to be disrespectful, but OP's code is also a website that already exists literally thousands of times and could be done in any spreadsheet program without any programming at all...
It’s more like AI provides the development team, and you are the key user and product manager that comes with all the requirements and domain knowledge, the lead architect reviewing the architecture, and the lead UXer reviewing the UX.
Thank you for the beautiful story. I work as a developer and have experienced the same in my personal projects, my Linux setup, and, in general, all the collateral work.
AI is eroding the entry barrier, the cognitive overload, and the hyper-specialization of software development. Once you step away from a black-and-white perspective, what remains is: tools, tools, tools. Feels great to me.
I think people would have reacted a lot more positively if you'd said right up front in the first line "hey look guys, yes I wrote this with ChatGPT but I am not a native English speaker so I've used AI to translate"
Otherwise it feels deceptive. Which is surprising given we should judge off intentions and not augmentation (like come on guys this is HN FFS).
This guy's not running any ads on the site, hasn't spammed with multiple posts that I've seen. I still think investment funds/modern stock exchanges are needless parasites upon society but that's just my opinion.
Congrats! I never stopped coding, but AI makes it way more productive and fun for sure.
$100 seems like a lot. I guess if you think about it compared to dev salaries, it's nothing. But for $10 per month copilot you can get some pretty great results too.
$100 did feel steep at first. I tried other models but Opus 4 with extended thinking just hits different — it actually gets what I'm trying to do and the code often works first try. Hard to go back after that.
>The problem? Every compound interest calculator online is terrible. Ugly interfaces, ads covering half the screen, can't customize compounding frequency properly, no year-by-year breakdowns. I've tried so many. They all suck.
Yeah, you're right — that part is pretty rough. I wanted to help people actually understand compound interest (it's kind of life-changing once it clicks), but I got lazy and let AI do it without proper editing. Defeats the whole point.
I'll figure out a better way. Thanks for calling it out.
These posts will destroy this place. Post your AI written tools if you like - fine, but using an LLM to reply to comments is just insulting, and will make this place a wasteland of LLM. I wouldn’t post this if I didn’t care about the usual good quality of the discussions on this site.
Just another AI generated website with 5000 calculators thrown together that looks like every other single one. From a brand new account with a post that looks like it was also written from ChatGPT. Somehow getting enough votes to show up on my homepage.
Things are definitely changing around HN compared to when it first started.
Fair call — it did kind of explode from one calculator to 60+
I’m a real person (long-time lurker, finally posting), but I get why it looks sus.
Things are changing fast, and I’m just happy to be part of the messy early wave. Thanks for the honesty.
It's impossible to tell if this is AI or not. Another version of Poe's law. The only thing to do is assume everything is AI, just like you must assume all posts have ulterior (generally profit-driven) motives, all posters have a conflict of interest, etc.
Maybe the only thing to do is stop trying to understand posters' motivations, stop reading things charitably, stop responding, just look for things that are interesting (and be sure to check sources).
We're busy building real software, not toys. I routinely write all kinds of calculators in my game development, in addition to having 100x more complex code to contend with. This task is as trivial as it gets in coding, considering computers were literally made to calculate and calculation functions are part of standard libraries. OP definitely didn't use Claude to implement math functions from scratch, they just did the basic copy-and-paste work of tying it to a web interface on a godawful JS framework stack which is already designed for children to make frontends with at the cost of extreme bloat and terrible performance. Meanwhile I actually did have to write my own math library, since I use fixed-point math in my game engine for cross-CPU determinism rather than getting to follow the easy path of floating-point math.
It's cool that ChatGPT can stitch these toys together for people who aren't programmers, but 99% of software engineers aren't working on toys in the first place, so we're hardly threatened by this. I guess people who aren't software engineers don't realise that merely making a trivially basic website is not what software engineering is.
> I guess people who aren't software engineers don't realise that merely making a trivially basic website is not what software engineering is.
"Software engineering" doesn't matter to anyone except to software engineers. What matters is executing that idea that's been gathering dust for ages, or scratching that pain point that keeps popping up in a daily basis.
Software engineering matters very much to anyone who has ideas or pain points that are beyond the capabilities of a next-token prediction engine to solve.
My response is perhaps a bit raw, but so is the quote above.
Stop with the gate keeping. I've studied CS to understand coding, not to have some sort of pride to build "real software". Knowledge is a tool, nothing more, nothing less.
There are enough developers whose whole job it is to edit one button per week and not much more. And yes, there are also enough developers that actually apply their CS skills.
> but 99% of software engineers aren't working on toys in the first place
Go outside of your bubble. It's way more nuanced than that.
> I guess people who aren't software engineers don't realise that merely making a trivially basic website is not what software engineering is.
Moving goal posts. Always has been.
It's not that I fully disagree with you either. And I'm excited about your accomplishments. But just the way it reads... man...
I guess it hits me because I used to be disheartened by comments like this. It just feels so snarky as if I am never good enough.
The vibe is just "BUH BUH BUH and that's it." That's how it comes across.
And I've come to mature enough to realize I shouldn't feel disheartened. I've followed enough classes at VUSEC with all their rowhammer variations and x86-64 assignments to have felt a taste of what deep tech can be. And the thing is, it's just another skill. It doesn't matter if someone works on a web app or a deep game programming problem.
What matters (to me at least) that you feel the flow of it and you're going somewhere touching an audience. Maybe his particular calculator app has a better UX for some people. If that's the case, then his app is a win. If your game touches people, then that's a win. If you feel alive because you're doing complex stuff, then that's a win (in the style of "A Mathematician's Apology"). If you're doing complex stuff and you feel it's rough and you're reaching no one with it, it's neutral at best in my book (positive: you're building a skill, negative: no one is touched, not even you).
Who cares what the underlying technology is. What's important is usability.
Feel free to point out where I moved goal posts. To say that I moved goal posts would imply that at one point I stated that creating a trivial website was software engineering. If you're comparing my statement to what some other person said, who made arguments I did not make, then we cannot have any kind of constructive dialogue. At that point you are not talking to me, but talking to an imaginary projection of me meant to make yourself feel better about your argument.
> Stop with the gate keeping.
I'm not gatekeeping anything. You can disagree with my descriptive terms if you want, but the core point I'm trying to get across is: what people are doing with Claude can not replace what I do. I would know, I've tried extensively. Development is a lot of hard work and I would love it if my job were easier! I use LLMs almost every day, mostly for trivial tasks like reformatting text or writing advanced regex because I can't be bothered to remember the syntax and it's faster than looking it up. I also routinely pose SOTA models problems I'm working on to have them try to solve them, and I am routinely disappointed by how bad the output is.
So, in a thread where people were asserting that critics are merely critics because they're afraid of being replaced I pointed out that this is not factually correct, that no, we're not actually afraid of being replaced, because those of us who do "real" engineering (feel free to suggest a different term to substitute for "real" if the terminology is what bothers you) know that we cannot be replaced. People without experience start thinking they can replace us, that the exhilarating taste of coding they got from an LLM is the full extent to the depth of the software engineering world, but in fact it is not even close.
I do think that LLMs fill a useful gap, for projects where the time investment would be too large to learn to code and too unimportant to justify paying anyone to program, but which are simple enough that a non-engineer can have an LLM build something neat for themselves. There is nothing wrong with toys. Toys are a great thing to have in the world, and it's nice that more people can make them[1]. But there is a difference between a toy and what I do, and LLMs cannot do the thing I do. If you're taking "toy" in a derogatory manner, feel free to come up with another term.
[1] To some extent. While accessibility is generally a great thing, I have some misgivings. Software is dangerous. The web is arguably already too accessible, with frameworks enabling people who have no idea what they're doing to make professional-looking websites. These badly-made websites then go on to have massive security breaches that affect millions of users. I wish there was a way to make basic website development accessible, whether through frameworks or LLMs, in a way that did not give people using them the misplaced self-confidence to take on things way above their skill level at the cost of other people's security.
Idk, your superiority complex about the whole issue does make it sound like you’re feeling threatened. You seem determined to prove that AI can’t really make any decent output.
What’s even the point of writing out that first paragraph otherwise?
> What’s even the point of writing out that first paragraph otherwise?
I was correcting your misguided statement:
> Their critics didn’t make that!
by pointing out that we, among other things, build the libraries that you/Claude are copy-and-pasting from. When you make an assertion that is factually incorrect, and someone corrects you, that does not mean they are threatened.
You're right that this is simple compared to what real engineers build. I have a lot of respect for people like you who write things like custom math libraries for cross-CPU determinism — that's way beyond my level.
I'll keep learning and try to make this less of a toy over time. And hopefully I can bring what I've learned from years in investing into my next product to actually help people. Thanks for the perspective.
What are you implying? He would have had to hire a good developer for at least a full month's salary to build something like this.
And if you are thinking enterprise, it would take 2-3 developers, 2 analysts, 2 testers, 1 lead and 1 manager 2-3 months to push something like this. (Otherwise, why would leading banks spend billions and billions on IT development every year? What tangible difference do you see in their websites/services?)
5000 calculators may look excessive, but in this case it showcases where AI capabilities are heading - both in terms of quality and quantity.
> (Otherwise, why would leading banks spend billions and billions on IT development every year? What tangible difference do you see in their websites/services?)
Well, I don't think all those people are spending their time making simple calculators.
Twitter/X incentivizes you to chase engagement because with a blue checkmark you get paid for it, so people shill aggressively and post idiotic comments on purpose trying to ragebait you. It's like LinkedIn for entrepreneurs. Reddit, or its power-hungry moderators, (shadow)bans people often. The number of popular websites where people can shill their trash is dwindling, so I assume it gets worse here as a result too.
Well, in my opinion there's nothing wrong with vibe-coding. You can completely use it to make your passion projects. I draw the line when people try to sell their vibe-coded project as something huge, putting people at risk of potential security breaches while also taking their money.
Every other day I see ads from companies saying "use our AI and become a millionaire". This kind of marketing from agentic IDEs implies there's no need for developers who know their craft, which, as said above, isn't the case.
Fair, but the threat model matters here. For a static mortgage calculator, the data-leak risk is zero (if it's client-side). The risk here is different - it's logical. If the AI botches the formula and someone makes a financial decision based on that, that's the problem. For "serious" projects, vibe coding must stop where testing and code audits begin.
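To make that logical risk concrete, here is a minimal sketch (TypeScript, my own illustration, not anything from OP's site) of the kind of client-side math at stake, using the standard fixed-rate amortization formula, plus the sort of cheap sanity check against a published reference value that even a vibe-coded calculator can afford:

```typescript
// Standard fixed-rate mortgage payment: M = P * r * (1 + r)^n / ((1 + r)^n - 1),
// where r is the monthly rate and n the number of monthly payments.
function monthlyPayment(principal: number, annualRatePct: number, years: number): number {
  const r = annualRatePct / 100 / 12; // monthly interest rate
  const n = years * 12;               // number of payments
  if (r === 0) return principal / n;  // zero-interest edge case
  const growth = Math.pow(1 + r, n);
  return (principal * r * growth) / (growth - 1);
}

// A botched formula leaks no data, but it can still steer a financial decision.
// Widely published reference: $300,000 at 6% over 30 years is about $1,798.65/month.
const payment = monthlyPayment(300_000, 6, 30);
console.assert(Math.abs(payment - 1798.65) < 0.05, `unexpected payment: ${payment}`);
```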
Totally agree. I have my day job, and vibe-coding has simply brought back the joy of building things for me. It should be about passion and creativity, not about scamming people or overselling half-baked products. The "get rich quick with AI" narrative is toxic.
This by definition filters out all non-devs, and even many junior devs, as you need to understand deeply whether those tests are correct and cover all the important edge cases, etc.
+ when you deploy it, you need to know it was properly deployed and that your DB creds are not on the frontend.
But mostly no one cares, as there are no consequences for leaking your users' personal data or whatnot.
I don't think vibe coding is quite good enough for real products. I say that as someone who usually has 4 AI agents going non-stop. And I do read the code (I read so, so much code), and I give the AI plenty of feedback.
If you just want to build a little web app, or a couple of screens for your phone, you'll probably be fine. (Unless there's money or personal data involved.) It's empowering! Have fun.
But if you're trying to build something that has a whole bunch of moving parts and which isn't allowed to be a trash fire? Someone needs to be paying attention.
Same. Fell out of love with programming after the first few years because the thought of spending my life staring at a screen and dealing with insignificant minutia suddenly seemed horrible. Spent a lot of years in management and LLMs gave me a way to build things I wanted again. Currently building a platformer.
This is tongue-in-cheek, but you spent years in management because "the thought of spending your life staring at a screen and dealing with insignificant minutia seemed horrible?" I need to read your management book!
It’s a lot of 1:1s and talking to people directly and strategy about setting up performant teams. I enjoy it way more and don’t spend a lot of time looking at screens.
> Stack: Next.js, React, TailwindCSS, shadcn/ui, four languages (EN/DE/FR/JA). The AI picked most of this when I said "modern and clean."
I guess this is what separates some people. But I always explicitly tell it to use only HTML/JS/CSS, with no libraries except ones I've vetted myself. Having the code generated means you don't have to deal with it nearly as much anymore.
Cool to hear nonetheless. Can we now also stop stigmatizing AI generated music and art? Looking at you Steam disclosures.
For me it’s kinda the same. I always hated typing actual code; I love planning, reading, finding bugs, etc.
But writing code? Eh, I never enjoyed that. Now with agents I can kinda do exactly what I like: plan, write in natural language, and then do code review.
Do you truly believe it won't get better, maybe even better at whole system design and implementation than people?
They've even got their own slogan: "you're probably just not prompting it properly"
Just like SEO experts, marketing experts, trading bots and crypto experts, the vibe coders will be weeded out.
1. Invoice billing review. Automated 80% of what was a manual process by surfacing AI suggestions automatically. Saved 3 hours per day of manager time. Increased topline by 10%. Dev time: 1 day.
2. Data dashboards. We use a janky SaaS that doesn't have APIs. Automated a scraper to log in, download the reports daily, parse and upload them to a database, and build a dashboard (a rough sketch of this pipeline follows below). It used to take my associate 3 hours per week to do this in a crappy spreadsheet. Now I have it in a perfect database, updated much more frequently. Dev time: 4 hours.
We are attacking little problems all across the business now.
A MIRACLE!!!!
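For anyone wondering what item 2 above looks like in practice, here is a rough, hypothetical sketch of that kind of pipeline in TypeScript (Node 18+ for the built-in fetch). The URLs, credential variables, and report columns are placeholders I invented, not the actual SaaS or schema, and the real version would load the rows into a database for the dashboard instead of printing them:

```typescript
// Hypothetical daily-report pipeline: log in to an API-less SaaS, download a
// CSV report, and normalize the rows for a database load.
type Row = { date: string; account: string; amount: number };

async function fetchDailyReport(): Promise<string> {
  // Form-based login; many API-less SaaS tools hand back a session cookie.
  const login = await fetch("https://example-saas.invalid/login", {
    method: "POST",
    body: new URLSearchParams({
      user: process.env.SAAS_USER ?? "",
      pass: process.env.SAAS_PASS ?? "",
    }),
  });
  const cookie = login.headers.get("set-cookie") ?? "";

  // Download the daily report as CSV using the session cookie.
  const report = await fetch("https://example-saas.invalid/reports/daily.csv", {
    headers: { cookie },
  });
  return report.text();
}

function parseCsv(csv: string): Row[] {
  // Naive split; fine for a well-behaved internal report, not general CSV.
  const [, ...lines] = csv.trim().split("\n"); // drop the header row
  return lines.map((line) => {
    const [date, account, amount] = line.split(",");
    return { date, account, amount: Number(amount) };
  });
}

async function main() {
  const rows = parseCsv(await fetchDailyReport());
  // Real version: insert into the database the dashboard reads from.
  console.log(`parsed ${rows.length} rows`, rows.slice(0, 3));
}

main().catch((err) => {
  console.error("report pipeline failed:", err);
  process.exit(1);
});
```

Run something like this on a schedule (cron or a hosted scheduler) and the weekly spreadsheet chore disappears.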
I wouldn't want to hassle customers who have fully paid up accounts
is there a term for that?
AI at our fingertips, accessible and useful, that's just a tool, that's not redefining us as an industry and denying people's jobs – that's an asset. (I used an em dash to prove I am not AI, as apparently double dash is now a sign of AI text!)*
(*) case in point, the situation is _TIRING_.
"Comment NEAT to receive the link, and don't forget to connect so I can email you" -- this is the most infuriating line ever.
In comms, they have something like a 1:4 ratio of design to validation engineers. Defence is slightly different, as it depends on the company, but generally the tolerance for bugs is zero. Let's not get started on the HF trading folks and their risk appetite!
There's a lot of room for software engineers. Most FPGAs are SoC devices now, running some form of embedded Linux doing high-level task management and networking. Provided you know enough Verilog to know your way around, you'll be fine. You're also in a space where most engineers I know are preparing to retire in the next 5-10 years, so there will be a panic which will ripple across industries.
Security engineers will have jobs until software is perfectly secure... and that is going to be a while.
I do not use LLMs at all to do my job, and it is unlikely I ever would. Clients pay me -after- they had all their favorite LLMs take a pass.
One day I might start a consultancy business that only does artisanal code. You can hire me and my future apprentices to replace AI code with handcrafted code. I will use my company to teach the younger generation how to write code without AI tooling.
That's an interesting perspective. I guess it depends on what you want and how low the stakes are. Artisanal coffee, sure. Artisanal clothing, why not? Would you want an artisanal MRI machine? Not sure. I wouldn't really want it "hand crafted", I just want it to do its job.
These existed before but the culture surrounding AI delivered a double dose of both.
I have no problems with LLMs themselves or even how they are used but it has developed its own religion filled with dogma, faith based reasoning and priests which is utterly toxic.
The tools are shoved down our throats (thanks to the priesthood, AI use is now a job performance criteria) and when they fail we are not met with curiosity and a desire to understand but with hostility and gaslighting.
The cost of hallucinations though - you potentially have a stronger point there. It wouldn’t surprise me if that fails to sway some decision makers, but it does give the average dev a bit more ground to work with.
It's not so great for the one or two but fantastic for everybody else.
With AI, you are no longer a developer, you're a product manager, analyst, or architect. What's neat about this, from a business perspective, is that you can in effect cut out all your developers and have a far smaller development workforce consisting of only product managers, analysts, and architects whom you call "developers" and pay developer salaries to. So you save money twice: once on dev workforce downsizing, and again on the pay grade demotion.
I'm currently exploring domain-specific languages aimed at writing web applications. I've been particularly interested in data flowing through pipelines, much like bash. I have spent quite a bit of time on this, and I'm definitely not vibe coding, but I've probably only written 1-2% of the code in these projects.
It is so much work to build out a new language with a surrounding ecosystem of tooling. Not even five years ago this would necessarily have been a full-time, multi-year endeavor, or at least required a team of researchers. Now I can tinker away in my off hours.
This is what I am exploring:
https://williamcotton.com/articles/the-evolution-of-a-dsl
Did I not craft the syntax and semantics of these languages?
What about the phone in your hand, did you design that?
HN loves to believe they are the noble few - men and women of math and science, driven by nothing but the pure joy of their craft.
But this whole AI thing has been super revealing. Almost everyone here is just the same old same old, only now that the change is hitting close to home, you’re clutching your pearls and lamenting the days when devs were devs.
The younger generation born into the AI world is going to leave you in the dust because they aren’t scared of it
My math teacher used to say that people felt this way about…calculators. Imagine that.
This is such marketing speak. The words mean nothing, they’re just a vague amalgamation of feelings. “Vibes”, if you will.
If you “love delivering value and solutions”, go donate and volunteer at a food bank, there’s no need for code at any point.
> The happy consumer and the polished product
More marketing speak. If you are using LLMs to write your code, by definition your product isn’t “polished”. Polishing means poring over every detail with care to ensure perfection. Letting an LLM spit out code you just accept is not it.
The word you’re looking for is “shiny”, meaning that it looks good at a glance but may or may not be worth anything.
I get the argument. Sometimes I really enjoyed the actual act of finally figuring out a way to solve a problem in code, but most of the time it was a means to an end, and I'm achieving that end far more often now via AI tooling.
I’m not fussed about the exact term, as long as it points to something real and on equal semantic footing with the alternative.
Note how they described two areas of focus (what you “love”): “coding” and “delivering value/solutions”.
You can be a “coder” or a “programmer”, no one is a “deliverer of value/solutions”.
“Coding” is explicit: it’s an activity you can point at. “Delivering value/solutions” is vague; it’s corporate speak to sound positive without committing to anything. It doesn’t represent anything specific or tangible. It doesn’t even reference software, though that’s what it is, which makes it sound broader than it is. You could say “using and releasing apps”, for example, though proponents may feel that’s reductive (but then again, so is “coding”).
Again, what’s in contention here isn’t the exact term, but making sure it’s one that actually means something to humans, instead of marketing speak.
I'm extremely diligent about vetting all code in my repos. Everything is thoroughly tested and follows the same standards that were in my codebase before the invention of LLMs. I'm not "vibe coding". You're making assumptions because of your negative emotional reaction to LLMs.
I actually think this reveals more about you than you might realise. A _lot_ of people enjoy being able to help people resolve problems with their skills. "Delivering value" is marketing speak, but what it refers to is specifically helping people in ways that are valuable.
A lot of people who work in software are internally motivated by this. The act of producing code may (or may not be) also enjoyable, but the ultimate internal motivation is to hand over something that helps others (and the external motivation is obviously dollars and cents).
There is also a subset of people who enjoy the process of writing code for its own sake, but it's a minority of developers (and dropping all the time as tooling - including LLMs - opens development to more people).
> If you are using LLMs to write your code, by definition your product isn’t “polished”. Polishing means poring over every detail with care to ensure perfection.
You can say the same thing about libraries, interpreters, OSes, compilers, microcode, assembly. If you're not flipping bits directly in CPU registers, you're not poring over every little detail to ensure perfection. The only difference between you and the vibe coder who's never written a single LoC is the level of abstraction you're working at.
Edit:
> If you “love delivering value and solutions”, go donate and volunteer at a food bank, there’s no need for code at any point.
I think this maybe says a lot about you as well, as many people donate their time and efforts to others. I think it may be worth some self-reflection to see whether your cynicism has become nihilism.
I did use to volunteer at a food bank, but I used that example only because it’s quick and simple, no shade on anyone who doesn’t. I stopped for logistical reasons when COVID hit.
I have used the set of skills I’m good at to help several people with their goals (most were friends, some were acquaintances) who later told me I changed their life for the better. A few I no longer speak to, and that’s OK.
Oh, and before I became a developer, I worked in an area which was very close to marketing. Which was the reason I stopped.
So yeah, I know pretty well what I’m talking about. Helping others is an explicit goal of mine that I derive satisfaction from. I’d never describe it as “delivering value/solutions” and neither would any of the people I ever helped, because that’s vague corporate soulless speech.
How do you feel about the fact that OpenAI et al. have slurped up all your code and are now regurgitating it for $20/month?
I also don’t think “but it wouldn’t be viable otherwise” is a valid defence.
I don’t see what that has to do with the conversation, though. If your point is about the free/$20, that doesn’t really factor into my answer.
> More marketing speak. If you are using LLMs to write your code, by definition your product isn’t “polished”.
This doesn’t make any sense. Polished to who? The end user? You can absolutely use AI to polish the user experience. Whether coding by hand or AI the most important aspect of polish is having someone who cares.
Can't the customer now just skip you and generate a product for himself via AI?
Or to phrase it more succinctly: if you are in camp 2 but don't have the passion of camp 1, you are a threat for the long term. The reverse is dangerous too, but can be offset to a certain extent with good product management.
This is a solved problem with any large, existing, older code base. The original writers are gone and new people come on all the time. AI has actually helped me get up to speed in new code bases.
Is this also true of all third-party code used by their solution? Should they make all the libraries and APIs they use their own, in exactly the form they need to be, according to their deep expertise? If not, why not?
If so, does this extend to the rest of the stack? Interpreters, OSes, drivers? If not, why not?
This isn't a trick question, BTW. It's a genuine attempt to get to the rationale behind your (and the GP's) stance on this.
In particular, the GP said:
> Or to phrase it more succinctly: if you are in camp 2 but don't have the passion of camp 1, you are a threat for the long term.
That hints I think at their rationale, that their stance is based on placing importance on the parts of software development that they enjoy, rather than any logical basis.
This happens, but very rarely compared to changes in your own code base. If a library breaks, you can usually find an alternative, but even in that case you need to know how to modify your own code.
The difference with generated code is that you are tasked to maintain the generated code.
I don't think this is true, but say we accept it.
> The difference with generated code is that you are tasked to maintain the generated code.
Is this a task that LLMs are incapable of performing?
That's what people tend to report, yes.
I think for a lot of minor things, having AI generate stuff is okay, but it’s rather astounding how verbose and sometimes bizarre the code is. It mostly works, but it can be hard to read. What I’m reading from a lot of people is that they’re enjoying coding again because they don’t have to deal with the stuff they don’t want to do, which...I mean, that’s just it isn’t it? Everyone wants to work on what they enjoy, but that’s not how most things work.
Another problem is that if you just let the AI do a lot of the foundational stuff and only focus on the stuff that you’re interested in, you sometimes just miss giant pieces of important context. I’ve tried reading AI driven code, sometimes it makes sense, sometimes it’s just unextensible nonsense that superficially works.
This isn’t tech that should replace anything and needs to be monitored judiciously. It can have value, but what I suspect is going to happen is we are going to have a field day with people fixing and dealing with ridiculous security holes for the next decade after this irrational exuberance goes away. It should be used in the same way that any other ML technique should be. Judiciously and in a specific use case.
Said another way, if these models are the future of general programming, where are the apps already? We’re years into this and where are they? We have no actual case studies, just a bunch of marketing copy and personal anecdotes. I went hunting for some business case studies a while ago and I found a Deloitte “case study” which was just pages of “AI may help” without any actual concrete cases. Where are the actual academic studies showing that this works?
People claiming AI makes them code faster reminds me that Apple years ago demonstrated in multiple human interaction studies that the mouse is faster, but test subjects all thought keyboard shortcuts were faster [1]. Sometimes objective data doesn’t matter, but it’s amusing that the whole pitch for agentic AI is that it is faster and evidence is murky for this at best.
[1] https://www.asktog.com/TOI/toi06KeyboardVMouse1.html
More likely that step is just skipped and replaced with thoughts and prayers.
There are people who would code whether it was their career or not; I'm not one of those people. I fell into software development in order to make money, and if the money stopped then I would stop. I love building and selling products; if I can't do that then I have no interest in programming. I'm not interested in machines, CPUs, etc. I'm interested in products, liaising with customers, delivering solutions, improving things for users, etc. You think there is no distinction there? Again, there are people who code for fun; I'm simply not one of them...
I like using my software engineering skills to solve people's problems. I don't do coding for its own sake - there's always a thing I'm trying to implement for someone.
Now I build all sorts of apps for my farm and the organizations I volunteer for. I can pound out an app for tracking sample locations for our forage association's soil sample truck, another for moisture monitoring, or a fleet task/calendar/maintenance app in hours, and iterate on them when I think of features.
And git was brand new when I left the industry, so I only started using it recently to any extent, and holy hell, is it ever awesome!
I'm finally able to build all the ideas I come up with when I'm sitting in a tractor and the GPS is steering.
Seriously exciting. I have a hard time getting enough sleep because I hammer away on new ideas I can't tear myself away from.
I'm teaching my kid what I consider the AI dev stack: AI IDE (Antigravity for us), database (Supabase for us with a nice MCP server), and deployment (GitHub and Vercel for us). You can make wonderful little integrated apps with this in hours.
Did you take over a farm?
What a stupid sentiment on top of trying to generate views for the most low hanging slop ever.
I'm also now dealing with things that previously would have taken me too long to deal with. For example, I'm actually making a dent in the amount of technical debt I have to deal with. The type of things where previously I maybe wouldn't have taken a week out of my schedule to deal with something that was annoying me. A lot of tedious things that would take me hours/days now can get done in a few prompts. With my bigger projects, I still do most stuff manually. But that's probably going to change over the next months/year.
I'm mainly using Codex. I know a lot of people seem to prefer Claude Code, but I've been a happy ChatGPT Plus user for a while, and Codex is included with that and seems to do the job. Amazing value for $20/month. I've had to buy extra credit once so far.
The flip side of all this is that waiting for AI to do its thing isn't fun. It's slow enough that it slows me down and fast enough that I can't really multitask. It's like dealing with a very slow build that you have to run over and over again. A necessary evil. But not necessarily fun. I can see why a lot of developers feel like the joy is being sucked out of their lives.
Dealing with this pain is urgent. Part of that is investing in robust and fast builds. Build time competes with model inference in the time stuff takes. And another part is working on the UX of this. Being able to fork multiple tasks at once is hugely empowering. And switching between editing code and generating code needs to get more seamless. It feels too much like I'm sitting on my hands sometimes.
Creating a polished, usable app is just so much work, and so much of it isn't fun at all (to me). There are a few key parts that are fun, but building an intuitive UI, logging, error handling, documentation, packaging, versioning, containerization, etc. is so tedious.
I'm bewildered when I read posts by the naysayers, because I'm sitting here building polished apps in a fraction of the time, and they work. At least much better than what I was able to build over a couple of weekends. They provide real value to me. And I'm still having fun building them.
I've now vibe coded three apps, two of them web apps, in Rust, and I couldn't write a "Hello World" in Rust if you held a gun to my head. They look beautiful, are snappy, and it being Rust gives me a lot of confidence in their correctness (feel free to disagree here).
Of course I wouldn't vibe code in a serious production project, but I'd still use an AI agent, except I'd make sure I understand every line it puts out.
I believe you can use LLMs as advanced search and as a generator for boilerplate. People who like it easy also tend to go easy on quality attributes, so everyone should be self-aware about where they are on that spectrum.
Then don’t do it. No one is forcing you. Are you also going to complain that building airplanes and ensuring food safety are too much work and not fun for you? Not everything needs to be or should be dumbed down to appeal to lowest common denominator.
Alternatively, go work at a company where you’re part of a team and other people do what you do not enjoy.
> I'm sitting here building polished apps in a fraction of the time
No, no you are not, guaranteed. “Polishing” means caring about every detail to make it perfect. If you’re letting the LLM make most of it, by definition it’s not polished.
No one is also keeping me from doing what I want to spend my time with on my days off.
> Are you also going to complain that building airplanes and ensuring food safety are too much work and not fun for you?
No, because this isn't remotely comparable to weekend hobby projects. What a weird question.
> No, no you are not, guaranteed. “Polishing” means caring about every detail to make it perfect. If you’re letting the LLM make most of it, by definition it’s not polished.
I guess we have different definitions of "polished" then.
I agree. But those also don’t need:
> intuitive UI, logging, error handling, documentation, packaging, versioning, containerization, etc. is so tedious.
Some of that, sure, but not all of it. Either it’s a weekend hobby project or it’s not, and your description is conflating both. A hobby is something done for fun.
That's why it was valuable.
All things worth doing are hard.
I always have a hard time taking this complaint seriously, because the solution is absolutely trivial. Write a snippet. Have you really been out there, year after year, rewriting the same shit from scratch over and over? Just make a snippet. Make it good and generic and save it. Whenever you need to do something repeated on a new project, copy it (or auto-expand if you use it that often) and adapt. Snippet managers are a thing.
Thanks.
For myself, I’ve always enjoyed “getting my hands dirty” with code, and the advent of LLMs has been a boon. I’m retired from 34 years of coding (and managing), and never skipped a beat. I’ve released a few apps since retiring. I’m currently working on the first app that incorporates a significant amount of LLM assistance. It’s a backend admin tool, but I’ll probably consider using the same methodology for more public-facing stuff in the future.
I am not one to just let an LLM write a whole app or server, unsupervised (I have control issues), but have allowed them to write whole functions, and help me to find the causes of bugs.
What LLMs have given me, is a decreased hesitance to trying new things. I’ve been learning new stuff at a furious rate. My experience makes learning very fast. Having a place to ask questions, and get [mostly] good answers (experience helps me to evaluate the answers), is a game-changer.
> “A ship in harbor is safe, but that is not what ships are built for.” –John A. Shedd
[0] https://littlegreenviper.com/miscellany/thats-not-what-ships...
I'm not sure how you can claim this on the footer of every page when you're vibe coding these calculators.
How confident is the OP that every single one of these 60 calculators work all the time, with all edge cases? Because if someone is on your website using your calculator, they are putting trust in you. If it's wrong, it could have downstream impacts on them. I hope every single one has a comprehensive set of tests with good edge cases. But realistically will they?
I'm actually pretty pro-AI development. But if you're going to use AI to help develop a website, at least focus on quality rather than quantity. AI makes quantity easy, but quality is still hard.
As an aside, the website doesn't even work for me. My clicks don't do anything.
If you want to use LLMs to help express something you don’t know the words for in English then that is a good use for LLMs, if it’s called out. Otherwise your messages scream LLM bot to native speakers.
“You’re absolutely right”, “That hits different”, “Good call!” “–“ are all classic LLM giveaways.
I’m not a moderator here, so you don’t have to listen to me either way.
Thankfully LLMs are still very stupid. Especially when it comes to security engineering, my specialty, so looks like I have a while yet.
No it didn't; in fact, your job shifted from code writer to code fixer.
The years "away" gave me an unusually clear picture of what problems actually need solving vs what's technically interesting to build. Most devs early in their careers build solutions looking for problems. Coming back after working in a specific domain, I had the opposite - years of watching people struggle with the same friction points, knowing exactly what the output needed to look like.
What I'd add to the "two camps" discussion below: I think there's a third camp that's been locked out until now. People who understand problems deeply but couldn't justify the time investment to become fluent enough to ship. Domain experts who'd be great product people if they could prototype. AI tools lower the floor enough that this group can participate again.
The $100 spent on Opus to build 60 calculators is genuinely good ROI compared to what that would have cost in dev hours, even for someone proficient. That's not about AI replacing developers - it's about unlocking latent capability in people who already understand the problem space.
Feels like forums have turned into a grand Turing Test.
The author even insists that AI was used because of their poor English, which is the standard excuse on Reddit as well. But clearly, this is not a translation:
> Curious if others have similar stories. Anyone else come back to building after stepping away?
This is bog-standard AI slop to increase engagement.
Look at the blog on their linked site as well. AI-generated posts.
This has been posted here for SEO. This is a business venture.
It's times like this when I think HN needs a post downvote button. Flagging might not be quite appropriate here, but I hate to see this content cluttering up the front page.
You improve over time. I've been programming for 6 years and I still feel like I'm nowhere near others. That's a completely fine and valid thing to feel.
Perhaps if we didn’t have deep layer cakes of frameworks and libraries, people would feel like they can code with or without AI. Feels like AI is going to hinder any efforts to address complexity and justify us living with unnecessary complexity simply because a machine can write the complex, hard to understand, brittle code for us.
Nit: it seems like the graph for the compound interest calculator should start at year 0 rather than year 1.
Also, it might be nice to have a way to change the starting year to the actual year you want to start (such as the current year).
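For what it's worth, here is a minimal sketch of the series behind such a chart (TypeScript; the names are mine, not taken from the site), where index 0 is the starting balance before any interest and the year labels begin at a configurable start year:

```typescript
// Year-by-year compound growth: balance_i = principal * (1 + r)^i, with i = 0
// being the initial deposit, so the chart starts at the actual starting year.
function compoundSeries(
  principal: number,
  annualRatePct: number,
  years: number,
  startYear: number = new Date().getFullYear(),
): { year: number; balance: number }[] {
  const r = annualRatePct / 100;
  const points: { year: number; balance: number }[] = [];
  for (let i = 0; i <= years; i++) {
    points.push({ year: startYear + i, balance: principal * Math.pow(1 + r, i) });
  }
  return points;
}

// e.g. $1,000 at 5% over 3 years: the first point is { year: <current year>, balance: 1000 }.
console.log(compoundSeries(1000, 5, 3));
```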
Can definitely understand the reluctance people feel around it. Especially when they’ve invested years into it and have their livelihood on the line
I’m also quite reluctant to publish any of it. Doesn’t feel right to push code I don’t fully understand so mostly personal projects for now
That's creating a new inefficient, socially destructive, environmentally damaging hammer because solving the real problem doesn't sell well.
I'll be happy when we solve THAT problem.
https://youtu.be/JJz5D9txeGA
If this were about grammar, it would be appropriate to translate something you wrote, not use generative AI to create it.
This whole thing is an ad. All the post's sentiments that people are engaging with ("imposter syndrome" etc.) were spit out by a clanker.
What a disheartening start to my morning.
AI is eroding the entry barrier, the cognitive overload, and the hyper-specialization of software development. Once you step away from a black-and-white perspective, what remains is: tools, tools, tools. Feels great to me.
Otherwise it feels deceptive. Which is surprising given we should judge off intentions and not augmentation (like come on guys this is HN FFS).
This guy's not running any ads on the site, hasn't spammed with multiple posts that I've seen. I still think investment funds/modern stock exchanges are needless parasites upon society but that's just my opinion.
$100 seems like a lot. I guess if you think about it compared to dev salaries, it's nothing. But with the $10-per-month Copilot you can get some pretty great results too.
Edit: I appreciate the quick turnaround. Apologies.
Have you tried this? https://www.investor.gov/financial-tools-calculators/calcula...
I'll figure out a better way. Thanks for calling it out.
https://news.ycombinator.com/newsguidelines.html
Things are definitely changing around HN compared to when it first started.
Every spammer and scammer, even a bot, is ultimately controlled by a real person in some sense. That doesn't mean we want their content here.
It's impossible to tell if this is AI or not. Another version of Poe's law. The only thing to do is assume everything is AI, just like you must assume all posts have ulterior (generally profit-driven) motives, all posters have a conflict of interest, etc.
Maybe the only thing to do is stop trying to understand posters' motivations, stop reading things charitably, stop responding, just look for things that are interesting (and be sure to check sources).
Anyone who disagrees with the above is just hurt that their manual hyping has been replaced with machines.
OP made a site with a bunch of calculators. Their critics didn’t make that!
It's cool that ChatGPT can stitch these toys together for people who aren't programmers, but 99% of software engineers aren't working on toys in the first place, so we're hardly threatened by this. I guess people who aren't software engineers don't realise that merely making a trivially basic website is not what software engineering is.
"Software engineering" doesn't matter to anyone except to software engineers. What matters is executing that idea that's been gathering dust for ages, or scratching that pain point that keeps popping up in a daily basis.
My response is perhaps a bit raw, but so is the quote above.
Stop with the gate keeping. I've studied CS to understand coding, not to have some sort of pride to build "real software". Knowledge is a tool, nothing more, nothing less.
There are enough developers whose whole job it is to edit one button per week and not much more. And yes, there are also enough developers that actually apply their CS skills.
> but 99% of software engineers aren't working on toys in the first place
Go outside of your bubble. It's way more nuanced than that.
> I guess people who aren't software engineers don't realise that merely making a trivially basic website is not what software engineering is.
Moving goal posts. Always has been.
It's not that I fully disagree with you either. And I'm excited about your accomplishments. But just the way it reads... man...
I guess it hits me because I used to be disheartened by comments like this. It just feels so snarky as if I am never good enough.
The vibe is just "BUH BUH BUH and that's it." That's how it comes across.
And I've matured enough to realize I shouldn't feel disheartened. I've followed enough classes at VUSEC with all their rowhammer variations and x86-64 assignments to have felt a taste of what deep tech can be. And the thing is, it's just another skill. It doesn't matter if someone works on a web app or a deep game programming problem.
What matters (to me at least) that you feel the flow of it and you're going somewhere touching an audience. Maybe his particular calculator app has a better UX for some people. If that's the case, then his app is a win. If your game touches people, then that's a win. If you feel alive because you're doing complex stuff, then that's a win (in the style of "A Mathematician's Apology"). If you're doing complex stuff and you feel it's rough and you're reaching no one with it, it's neutral at best in my book (positive: you're building a skill, negative: no one is touched, not even you).
Who cares what the underlying technology is. What's important is usability.
If that’s the bar, there are likely a ton of businesses that should shut down…
This is a revolution, welcome back to coding :)