My primary use case for Amber is when I need to write a Bash script but don’t remember the silly syntax. My most recent Bash mistake was misusing test -n and test -z. In Amber, I can just use something == "" or len(something) == 0
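For reference, the classic trap with test -n (a minimal sketch, not code from the post): with an unquoted empty variable, test -n collapses into a one-argument test, which is true for any non-empty string, including the literal string "-n" itself.

```shell
var=""

# Wrong: $var expands to nothing, so this becomes `[ -n ]`, a
# one-argument test that is true because "-n" is a non-empty string.
[ -n $var ] && echo "unquoted -n: claims non-empty"

# Right: quoting keeps the empty string as an argument.
[ -n "$var" ] || echo "quoted -n: correctly empty"

# test -z is the inverse: true when the string is empty.
[ -z "$var" ] && echo "quoted -z: correctly empty"
```

Amber's `something == ""` sidesteps the whole quoting question, which is the point being made above.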

  • e8d79@discuss.tchncs.de · 8 days ago

    This is an interesting idea and, from the looks of it, well executed, but I am having trouble imagining a scenario where I would prefer to use Amber over a scripting language like Python. If your bash scripts are getting long enough to warrant the use of Amber, you are probably already in a situation where you can justify installing Python.

    • DoctorPress@lemmy.zip · 8 days ago

      If your program relies on too many external tools, then I think it makes more sense to use Bash than to abuse os.system.
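To illustrate the point (a toy example, not from the thread): a pipeline that is one line of shell turns into a quoting-riddled string when pushed through os.system, which captures no output and gives you no structured exit status.

```shell
# Finding the most frequent line: natural as a shell pipeline.
# Driving the same thing from Python via os.system("...") would inherit
# all of the shell's quoting pitfalls while adding none of Python's structure.
printf 'b\na\nb\nc\nb\n' | sort | uniq -c | sort -rn | head -n 1
```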

    • themoken@startrek.website · 8 days ago

      I agree, but I can envision scenarios where you are integrating into someone else’s workflow/machine and they (or their build system etc.) are expecting a shell script. Python is ubiquitous but sometimes you just want to work like everything else.

      • SinTan1729@programming.dev · 7 days ago

        Also useful for scripts to be executed inside containers. Alpine doesn’t have Python installed by default.

    • sga@piefed.social · 8 days ago

      To some extent: sunk cost fallacy and performance.

      I had a launcher script which required its run to complete in under 50 ms to be usable. Python just did not make the cut (it would call external stuff and more). I know I should not expect performance from shell scripts, but it started as essentially a find over about 20-30 files and a cat of at most 100 lines. It was fast enough then. Then I kept adding stuff, and it kept slowing down. I thought of converting it to Python, and even did some initial bits, but the performance was not good.

      Beyond a certain point, I kind of stopped caring about optimising (I replaced bash with dash, kept calls to external tools to a minimum, and tried to minimise slow calls - think calling some tool again and again vs a vectorised operation, or not reading a variable multiple times). At some point it reached 300-400 ms, and I decided to split it into two parts: an executor which caches the output of some commands under certain run conditions, and a launcher which just reads the output file (by then almost 1 MiB).
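That executor/launcher split can be sketched roughly like this (all names, the stand-in workload, and the one-minute staleness rule are made up for illustration):

```shell
CACHE="${TMPDIR:-/tmp}/launcher-cache.txt"

# Stand-in for the expensive part (the real script did a large find + cat).
slow_scan() {
    printf 'app-one\napp-two\n'
}

# "Executor": regenerate the cache only when it is missing or stale
# (here: older than one minute; the real run conditions were more involved).
refresh_cache() {
    if [ ! -f "$CACHE" ] || [ -n "$(find "$CACHE" -mmin +1 2>/dev/null)" ]; then
        slow_scan > "$CACHE"
    fi
}

# "Launcher": only ever reads the precomputed file, so startup stays fast.
launch() {
    refresh_cache
    cat "$CACHE"
}

launch
```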

      At some point I decided to learn Rust, and the first thing I wrote was that launcher. I implemented the caching and run conditions better, moved the larger files (it now reads multiple megabytes, 15+) to the /tmp dir, which is mounted in memory, and tried to minimise variable lookups. Now it lists 10 times more files in less than a third or a fifth of the time.

      tl;dr - a stupid person did not shift to a compiled program for a long time and kept optimising a shell script

      • SinTan1729@programming.dev · 7 days ago

        Been there, done that lol. Nowadays, if I think that some script is getting too long, I just rewrite it in Go. It’s faster to write than Rust (in fact, I find it almost as easy as Python), and performant enough to replace scripts.