I am the developer of Summit for Lemmy.

  • 104 Posts
  • 805 Comments
Joined 3 years ago
Cake day: June 13th, 2023



  • I mostly don’t use AI… At least not directly for programming. I use it for other things like translating, formatting text, etc. I sometimes ask AI to make something for prototyping purposes.

    I will occasionally ask AI to solve programming problems, more to keep up with current trends. I like to stay informed about what AI can and cannot do because even if I choose not to use it, the same will not be true of my coworkers or other people I interact with. Having a good understanding of the current “meta” for AI lets me know what to look out for in the context of avoiding disasters.








  • I started by “making” frozen dinners, instant noodles and meal kits.

    The most complex stuff I’d make is stir frying with some stir fry sauce.

    Starting somewhere around 3rd year in college I decided I wanted to get better at cooking so I would look up a recipe for something I liked to eat once a month and try to make it.

    Once I graduated I realized I actually like cooking, so I took the idea further and decided to make a new dish every week. I would research a dish, find a recipe that I thought looked good, and then buy the ingredients the next time I was at the grocery store. I practiced mise en place (i.e. I would measure, wash, and cut every ingredient before turning on the heat) and it really helped make every dish accessible.

    I did this for 10 years. Turns out if you consistently cook at least once a week for 10 years you make mistakes, learn and get better.

    I’m not as good as a chef and my knife skills suck, but I like to think I can cook food as good as most restaurants. I also got to explore a large range of dishes and discovered a lot of foods I love and how to make them.



  • This is cooking advice.

    If you struggle with cooking or find that you mess up often, try preparing all of the individual ingredients before you start cooking, e.g. measure, wash, and cut every ingredient. Apparently this practice is called mise en place.

    If you ever watch a cooking video and it looks so effortless, this is probably why. It was a game changer back when I was learning to cook. Suddenly it felt like I could make every recipe with ease.

    This practice has drawbacks, as it can dirty more dishes and increase cook times, but it allows you to tackle most dishes at your own pace. I definitely recommend it whenever you make something for the first time.



  • Maybe you have not played for a while, but in the latest versions of Minecraft the caves are pretty insane, specifically in how varied they are in both size and depth.

    You can also realistically get a lot of diamonds by just caving now.

    I often end up with a stack of diamonds just from exploring a cave without intending to find diamonds at all.

    That being said...

    I’m a degenerate and start new worlds building farms and trading villagers for full diamond.



  • I’ve had this problem with abstractions for the longest time. Of course, whenever I say anything negative about abstractions I just get dog-piled, so I don’t usually like to discuss the topic.

    I think abstractions as a tool are fine. My problem is that most developers I meet only talk about the upsides of abstractions and never seriously weigh the downsides.

    More often than not, people treat abstractions as this magical tool you can’t overuse. In reality, overuse of abstractions can increase complexity and reduce readability. They can greatly reduce the number of assumptions you can make about code, which has many additional downsides.

    Of course I’m not saying we shouldn’t use abstractions. Not having any abstractions can be just as bad as having too many. You end up with similar issues, such as increased complexity and reduced readability.

    The hard part is finding the balance: the sweet spot where complexity is minimized and readability is maximized while using the fewest abstractions possible.

    I think too often, developers err on the side of caution, add more abstractions than necessary, and call it good enough. Developers really need to question whether every abstraction is absolutely necessary. Is it really worth adding an additional layer of abstraction just because a problem might arise in the future, versus reducing the number of abstractions and waiting for it to become a problem before adding more? I don’t think we do the latter enough. Often you can get away with slightly fewer abstractions than you think you need, because you will never touch the code again.
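
    To make the trade-off concrete, here’s a small hypothetical sketch (the discount example and all names are made up for illustration, not from any real codebase). Both versions compute the same thing; the first pays for flexibility it doesn’t yet need, the second stays direct until a second discount type actually appears:

    ```python
    from abc import ABC, abstractmethod

    # Over-abstracted: an interface plus a factory for a rule that
    # currently has exactly one implementation.
    class DiscountStrategy(ABC):
        @abstractmethod
        def apply(self, price: float) -> float: ...

    class PercentDiscount(DiscountStrategy):
        def __init__(self, percent: float):
            self.percent = percent

        def apply(self, price: float) -> float:
            return price * (1 - self.percent / 100)

    class DiscountFactory:
        @staticmethod
        def create(kind: str, amount: float) -> DiscountStrategy:
            if kind == "percent":
                return PercentDiscount(amount)
            raise ValueError(f"unknown discount kind: {kind}")

    def checkout_abstracted(price: float) -> float:
        # Three layers of indirection to take 10% off a price.
        return DiscountFactory.create("percent", 10).apply(price)

    # Direct: one function you can read and verify at a glance.
    # The strategy class can be introduced later, when (if) a second
    # discount type becomes a real problem.
    def checkout_direct(price: float) -> float:
        return price * 0.9  # 10% off
    ```

    Both produce identical results; the difference is only in how many files a reader has to open to confirm what a 10% discount does.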