-credit to nedroid for strange art

  • 50 Posts
  • 66 Comments
Joined 2 years ago
Cake day: June 10th, 2023

  • Yeah, I got my name when gmail was ‘invite only’ :/

    It actually kinda sucks. At any given time I have between 3 and a zillion idiots around the globe who, for months or years on end, keep buying concert tickets, airline/vacation bookings, get job hits, legal firm or health-care notifications … using my email ([email protected]) instead of ([email protected]) or whatever variant they actually signed up for, since I got ‘just my name’ and they keep. forgetting. their. own. bloody. email.

    And most of the time this shit is sent from a ‘[email protected]’ so I can’t even tell them they have the wrong email address. Grrrrrrr.

  • I feel this – we had a junior dev on our project who started using AI for coding, without management approval BTW (it was a small company and we didn’t yet have a policy specifically for it. Alas.)

    I got the fun task, months later, of going through an entire component that I’m almost certain was ‘vibe coded’ – it “worked” the first time the main APIs were called, but leaked and crashed on subsequent calls. It used double- and even triple-pointers to data structures which, per even a casual reading of the API vendor’s documentation, could all have been declared statically and re-used (this was an embedded system); needless arguments; and mallocs and frees everywhere for no good reason (again, all stemming from the un-needed dynamic storage behind those double/triple pointers). It was a horrible mess.
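
    For illustration, here’s a minimal C sketch of the contrast (the `Frame` struct and function names are hypothetical, not the actual vendor API) – a heap allocation through a double pointer on every call, which leaks unless every caller remembers to free, versus one statically declared buffer that’s simply overwritten and re-used:

    ```c
    #include <assert.h>
    #include <stdlib.h>
    #include <string.h>

    /* Hypothetical data structure standing in for the vendor's type. */
    typedef struct {
        int  id;
        char payload[64];
    } Frame;

    /* Leaky pattern (simplified): a fresh heap allocation through a
     * double pointer on every call; freeing is left to the caller. */
    static int get_frame_leaky(Frame **out) {
        *out = malloc(sizeof **out);
        if (!*out) return -1;
        (*out)->id = 42;
        strcpy((*out)->payload, "data");
        return 0;  /* caller must remember to free(*out), every time */
    }

    /* Static, reusable pattern: the caller owns one buffer (static or
     * stack) and it's overwritten on each call -- no heap, nothing to leak. */
    static int get_frame_static(Frame *out) {
        out->id = 42;
        strcpy(out->payload, "data");
        return 0;
    }

    int main(void) {
        /* One statically allocated frame, safely re-used across calls. */
        static Frame frame;
        for (int i = 0; i < 1000; i++) {
            assert(get_frame_static(&frame) == 0);
            assert(frame.id == 42);
        }

        /* The leaky version "works" once, but allocations pile up
         * unless every call site frees -- the bug described above. */
        Frame *p = NULL;
        assert(get_frame_leaky(&p) == 0);
        assert(p->id == 42);
        free(p);
        return 0;
    }
    ```

    On an embedded target with a fixed workload, the static version also makes the memory footprint knowable at link time, which is usually the point of avoiding malloc there in the first place.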

    It should have never gotten through code review, but the senior devs were themselves overloaded with work (another, separate problem) …

    I took two days and cleaned it all up: much simpler, no mem leaks, and it could actually be, you know, used more than once.

    Fucking mess, and LLMs (don’t call it “AI”) just allow those who are lazy and/or inexperienced to skate through short-term tasks, leaving huge technical debt for those who have to clean up after.

    If you’re doing job interviews, ensure the interviewee is not connected to LLMs in any way and make them write the code themselves. No exceptions. Consider blocking LLMs from your corp network as well, and banning locally-installed things like Ollama.