I have about 2 YoE, and I’m sure this changes with more experience.

I often hear this idea online that programmers should follow “just-in-time” learning, meaning you should prefer to learn things you don’t know while on the job. (The way some people talk about it, though, it sounds like you shouldn’t dare spend a single minute learning anything new outside of your 9-5.)

This seems like generally reasonable advice, especially for simpler things that take a few hours, like learning a specific language feature or library. But when I lean too heavily on JIT learning, it feels like it can be detrimental.

Many times, when I take on something big and new to me (say, deciding how to approach auth, microservice architecture, automated testing, or containerization), I end up making a big decision after a few hours or days of cursory reading of documentation and blogs, only to regret it some months later. At that point, maybe I’ll slow down, find a book on the subject, read it, and think, “Oh, darn, I wish I had known that N months ago.” It certainly feels like spending more time learning upfront could have avoided mistakes caused by lack of knowledge, though there’s no way to go back in time and know for sure.

I’m not asking about any area listed in particular. I feel like, for all of those, I’ve learned more in the time since, and would probably avoid some of my prior mistakes if I did it again. The question is more: How much do you subscribe to this idea of just-in-time learning? And if you do, how do you know when you’ve learned enough to be confident, or when you need to slow down and learn in more depth?

  • abhibeckert@lemmy.world · 33 points · 10 months ago

    In my opinion, the best developers are “generalists” who know a little bit about everything. For example, I have never written a single line of code in the Rust programming language… but I know, at a high level, all of the major pros and cons of the language. And if I ever face a problem where I need some of those pros and don’t care about the cons, then I will learn Rust and start using it for the first time.

    There’s not much benefit to diving deep into specialised knowledge of any particular technology, because chances are you will live your entire life without ever actually needing it. If anything, it might encourage you to force a square peg into a round hole: “I know how to do this with X, so I’m going to use X even though Y would be a better choice.”

    Wikipedia has a list of “notable” programming languages, with 49 languages under “A” alone, and I’ve personally learned and used three of the “A” languages. I dislike all three and seriously hope I never use any of them again… but at the same time, they were the best choice for the task at hand, and I would still use those languages if I were faced with the same situation again.

    And that’s nowhere near a complete list, which would probably have a few thousand under “A” alone. I know one more “A” language that didn’t make Wikipedia’s cut.

    The reality is you don’t know what technology you need to learn until you actually need it. Even if you know something that could be used to solve a problem, you should not automatically choose that path. Always consider if some other tech would be a better choice.

    Since you’re just starting out, I do recommend branching outside your comfort zone and experimenting with things you’ve never done before. But don’t waste time going super deep: just cover the basics and move on. If there’s a company you really want to work for and they’re seeking skills you don’t have… then maybe get those skills. But it’s risky, because the company might not hire you. Personally, I would take a different approach: try to get a different job at the company first. Once you’ve got that, start studying and ask your manager to help you transfer to the job you previously weren’t qualified for but now are. A well-run company will support you in that.

    As for learning outside of your 9-5… you should spend your spare time doing whatever you want. If you really want to spend your evenings and weekends writing code, then go ahead and do that… but honestly, I think it’s healthier in the long term to spend that time away from a desk and away from computers. I think it would be more productive, long term, to learn how to cook awesome food, do woodworking, play social football, or play music… or, of course, the big one: find a partner, have kids, and spend as much time with them as you can. As much as I love writing code, I love my kid more. A thousand times more.

    Programming is a job. It’s a fun job, but it’s still a job. Don’t let it be your entire life.

    • jeremyparker@programming.dev · 2 points · 9 months ago

      Look Ma, this guy says it’s ok that I’m a full stack dev. He says it’s even good!

      Also: counterpoint: if you teach your kids to code, you can outsource to them.

      • abhibeckert@lemmy.world · 2 points · 9 months ago

        this guy says it’s ok that I’m a full stack dev

        I’m also a full stack dev, so maybe I’m biased. I’ll add that there’s definitely a place for specialist work, but I don’t agree at all with people who think specialist developers are better than full stack ones.

        The way I see it, full stack devs either:

        • are good enough to be the only type of developer you hire; or
        • sit in between specialists and management

        Take OpenAI for example. They have a bunch of really smart people working on the algorithm, so much so that they’re not even engineers, they’re more like scientists. But they also have a full stack team who take that work and turn it into something users can actually interact with, and above the full stack team is the management team. Maybe OpenAI isn’t structured exactly that way, but that’s how I’d structure it.

        But most software isn’t like ChatGPT. Most software isn’t bleeding-edge technology: you can usually take open source libraries (or license proprietary ones). Let the specialists figure out how to make TCP/IP handle complex issues like bufferbloat… all the full stack dev needs to know is `response = fetch(url)`.
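
        To make that concrete, here is the same point as a minimal Python sketch; `requests` just stands in for whatever HTTP library your stack provides, and the URL is made up.

        ```python
        # Decades of specialist work on TCP congestion control, TLS, DNS and HTTP
        # sit underneath this one line. The full stack dev's job starts at this
        # abstraction boundary, not below it.
        import requests

        response = requests.get("https://example.com/api/items")  # illustrative URL
        print(response.status_code)
        ```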

  • Lmaydev@programming.dev · 24 points · 9 months ago

    I think what they mean is don’t spend your personal time learning stuff for work.

    If you need tech for work learn it at work.

    If you aren’t learning it well enough at work, spend more work time learning it.

    If they won’t give you that time, that’s their problem, not yours.

    I think the best advice is don’t make huge decisions based on a small amount of research tbh.

    Your personal programming shouldn’t be driven by work requirements.

  • Superb@lemmy.blahaj.zone · 11 points · 9 months ago

    You should spend your time at work doing work and your time at home doing anything else. Some people really enjoy programming and building in their off time; some people don’t. Follow your interests :)

  • kersplort@programming.dev · 9 points · 9 months ago

    If you want to level up your game, find a new job, or grow into a new role, by all means take a course or training on your own time. All of the concerns that you listed are probably worth spending dedicated time to upskill on.

    If you stay in this field for much longer, you’re going to run into a lot of cases where the thing you’ve been doing is replaced with the New Thing. The New Thing will have a couple new ideas, but will also fundamentally handle the same concerns as your Old Thing, often in similar ways. Don’t spend your free time chasing the New Thing without getting something out of it - getting paid, making a project that you wanted to make anyways, contributing to New Thing open source projects.

    If you sink work into the New Thing without anyone paying for it, that’s fine, but accept that you might never find someone who will. Most companies are more than willing to hire experienced Old Thing devs for New Thing jobs and will give you some time to skill up.

  • arran 🇦🇺@aussie.zone · 8 points · 10 months ago

    I feel like I have been doing this all my life, and I think it’s also about depth of understanding. But the environment has to support it: if there is an expectation that everyone is an expert from day one and no room for self-improvement, then it can’t be done.

    As stated, there are downsides to the approach, such as lack of exposure to new ideas; you still need to keep looking around, just not studying everything in depth. To me it’s also a work/life balance policy, but don’t practice it to the extreme, as that can hold you back. Good workplaces should allow for some learning time, and I’m hoping that gets normalized.

  • driving_crooner@lemmy.eco.br · 5 points · 10 months ago

    Don’t know if other people here can relate to my experience. I’m a mathematician, and I work in insurance writing Python/SQL to turn data into knowledge. Everything Python/SQL related I learn on paid hours, such as trying out libraries. Outside my job I’m working on an actuarial science MBA, where I’m learning theoretical knowledge that couldn’t be learned “on the job”. When something I learn in the MBA can be used at work, I rush to learn how to apply it in Python (while being paid for it). For example, a couple of weeks ago we learned how to find the probability distribution parameters of a sample using maximum likelihood estimation, and while on the clock I learned how to do that with scipy on the claims database.
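
    For example, something like this minimal sketch; the gamma distribution and the synthetic “claims” are illustrative assumptions, not our actual model:

    ```python
    # Fitting distribution parameters to a sample by maximum likelihood with scipy.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    claims = rng.gamma(shape=2.0, scale=1500.0, size=10_000)  # stand-in claim sizes

    # scipy's continuous distributions fit by maximum likelihood by default;
    # fixing loc=0 keeps the model sensible for strictly positive claim amounts.
    shape, loc, scale = stats.gamma.fit(claims, floc=0)
    print(f"MLE estimates: shape={shape:.2f}, scale={scale:.0f}")
    ```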

    I’m already thinking of doing a computer science master’s after finishing this MBA, because I’m really enjoying studying theoretical material in my own time while being paid to practice it and learn how to apply it in real-life applications.

  • Throwaway@lemm.ee · 4 points · 10 months ago

    My thoughts are that spending too much time on the computer is bad for you, and once you find a good long-term job, you don’t need to learn so much. You’re 2 years into your career; hopefully you’ll find a good fit.

    Knowing how to Hello World in Ada is not very helpful; try to find in-demand things to learn. There really isn’t that much.

    Well, that, and programming itself is starting to slow down.

  • Kissaki@programming.dev · 4 points · 9 months ago

    Your description of JIT learning sounded more like learn-only-on-the-job than just-in-time.

    When you say you “should have learned more upfront”, I don’t see why that would necessarily have to happen outside of the job or within it; where you learn something is open. For me, it’s a question of what’s reasonably necessary or efficient: does it make sense to explore more or prototype at work? Beyond that, I would have to have a personal interest in a topic to explore it in my private time just in time.

    Usually, I learn upfront unspecific to concrete work. And that experience and knowledge come in handy at work.

    When I’m on the job, I work on and learn what is necessary, to the degree it makes sense. I prefer to fully understand, but sometimes that’s not possible or acceptably efficient. Then you weigh risk against cost and prospect. There’s no way around weighing that each time, case by case; development in general is too varied for one fixed rule to cover every decision.

  • Elise@beehaw.org · 3 points · 9 months ago

    I read books at work when I need to, and in some cases that can take several days. That’s fine as long as it’s relevant to what I’m being paid to do. Make sure to communicate clearly. Learning is part of this job.

  • magic_lobster_party@kbin.run · 2 points · 10 months ago

    When you’re faced with a tough architectural decision, identify which option is easiest to change from. Part of the job is to make decisions with limited knowledge.

    The benefit of solutions that are easy to change is that if it turns out you made the wrong decision, you probably still have the chance to do some course correction.

    Programs are also dynamic; no decision should be final. Sooner or later you’ll probably need to course-correct anyway when requirements change, and you’ll be grateful you chose the option that’s easy to change from.

    This principle is called ETC (Easy to Change) in the book The Pragmatic Programmer, which also coined DRY (Don’t Repeat Yourself).
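
    As a minimal sketch of what “easy to change” can look like in code (the storage example and all names are mine, not from the book): callers depend on a small interface, so reversing the decision later touches one class instead of the whole codebase.

    ```python
    # ETC in miniature: hide a decision behind a small interface so a wrong
    # choice stays cheap to reverse. Names are illustrative.
    from typing import Protocol


    class UserStore(Protocol):
        def save(self, user_id: str, data: dict) -> None: ...
        def load(self, user_id: str) -> dict | None: ...


    class InMemoryStore:
        """Day-one choice: the simplest thing that works."""

        def __init__(self) -> None:
            self._rows: dict[str, dict] = {}

        def save(self, user_id: str, data: dict) -> None:
            self._rows[user_id] = data

        def load(self, user_id: str) -> dict | None:
            return self._rows.get(user_id)


    # Application code sees only UserStore. If in-memory turns out to be the
    # wrong call, a database-backed class with the same two methods drops in
    # without touching any caller.
    def rename_user(store: UserStore, user_id: str, name: str) -> None:
        data = store.load(user_id) or {}
        data["name"] = name
        store.save(user_id, data)
    ```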

    • abhibeckert@lemmy.world · 4 points · 10 months ago

      For some non-critical stuff, you can experiment until you find something that appears to work, deploy it, and fix any issues that appear. Too much of today’s Internet is built that way, but it really is OK sometimes.

      For critical work, you can apply much the same approach but replace the “deploy it” stage with “do extensive internal testing”. It takes longer and is more expensive, but it does work. For example, the first ever hydrogen-powered aircraft flew in 1957; it was an airplane with three engines, only one of which ran on hydrogen. Almost 70 years of engineering later, that’s still the approach being used: Airbus claims it will have commercial hydrogen-powered flights around 2035 and plans to flight-test the final production engine next year on an A380 aircraft.

      The A380 has four engines, and each is powerful enough for the plane to fly safely with only that one engine running. In fact, it should be able to land with all four engines failed, with a “Ram Air Turbine” providing electricity and hydraulic pressure to critical systems.

      The best approach to critical systems is not to build a perfectly reliable system, but rather to have enough redundancy that failures will not end in a “please explain” before Congress.
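
      The same idea in software terms, as a rough sketch (the endpoints are made up):

      ```python
      # Redundancy over perfection: assume any single dependency can fail and
      # fall through to a backup, instead of trying to build one that never fails.
      import requests

      ENDPOINTS = [
          "https://primary.example.com/status",   # illustrative
          "https://fallback.example.com/status",  # illustrative
      ]

      def fetch_status() -> dict:
          last_error = None
          for url in ENDPOINTS:
              try:
                  resp = requests.get(url, timeout=2)
                  resp.raise_for_status()
                  return resp.json()  # first healthy replica wins
              except requests.RequestException as err:
                  last_error = err  # this replica failed; try the next one
          raise RuntimeError("all redundant endpoints failed") from last_error
      ```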

      • solrize@lemmy.world · 3 points · 10 months ago

        It’s a bit more complicated when security is involved. I deleted that post because it didn’t seem responsive enough to OP’s question, but basically: there is a big difference between stuff going wrong randomly (Murphy’s law) and smart, determined adversaries trying to mess with you on purpose. Testing helps more with the former.

        • abhibeckert@lemmy.world · 4 points · 10 months ago

          Sure — security is one area where you do need to be a specialist.

          I’d say it’s the exception that proves the rule, though. Don’t write your own encryption algorithms, don’t invent new auth flows, and do hire third parties to audit and test your security systems. If you want to specialise in something like security, then yes, that’s something you should study. But at the same time, every programmer should have general knowledge in that area: enough to know when it’s OK to write your own security code and when you need to outsource it.
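
          As a sketch of where that line sits: gluing vetted primitives together is usually fine, inventing the primitive is not. (Everything below is from Python’s standard library; the work-factor parameters are illustrative, so check current published guidance before copying them.)

          ```python
          # Password storage built from vetted primitives instead of a homegrown scheme.
          import hashlib
          import hmac
          import secrets

          def hash_password(password: str) -> tuple[bytes, bytes]:
              salt = secrets.token_bytes(16)  # unique random salt per password
              digest = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
              return salt, digest

          def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
              candidate = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
              return hmac.compare_digest(candidate, digest)  # constant-time comparison
          ```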