• Kissaki
    3 days ago

    After the first half, content repetition sets in, which made me wonder how much of it was LLM-generated. By the end I feel like I had read two points in particular three times.

    Either way, the first half or first third was interesting and valuable.

    • Victor@lemmy.world
      3 days ago

      I did some benchmarks on iterator helpers a few months ago, and unless I used them incorrectly, regular array methods were much more performant, maybe by an order of magnitude, than the supposedly lazy iterator helpers.
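
      A minimal sketch of that kind of comparison (the names viaArray/viaIterator are mine, and it assumes a runtime with iterator helpers, e.g. Firefox 131+ or Node 22+; timings will vary by engine):

      ```javascript
      // Hypothetical micro-benchmark: eager array chain vs. lazy iterator-helper chain.
      const data = Array.from({ length: 1_000_000 }, (_, i) => i);

      // Eager: filter() allocates an intermediate array before map() runs.
      function viaArray(arr) {
        return arr.filter((x) => x % 2 === 0).map((x) => x * 2);
      }

      // Lazy: pulls one element at a time through the chain, then collects.
      function viaIterator(arr) {
        return arr.values().filter((x) => x % 2 === 0).map((x) => x * 2).toArray();
      }

      console.time("array chain");
      viaArray(data);
      console.timeEnd("array chain");

      console.time("iterator helpers");
      viaIterator(data);
      console.timeEnd("iterator helpers");
      ```

      Both return the same values; only allocation and evaluation strategy differ.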

      Maybe it’s a poor implementation in Firefox? I don’t know.

      • towerful
        3 days ago

        I think it’s the take(10) that makes it performant.
        I guess things like some() will also short-circuit, so you avoid processing the entire array through a chain of operators before acting on a subset of the final processed result.
        If that makes sense…
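
        One way to see the short-circuiting (assumes iterator-helper support; the pulled counter is instrumentation I added):

        ```javascript
        // take(3) stops pulling from the (infinite) source once it has 3 matches.
        let pulled = 0;
        function* source() {
          for (let i = 0; ; i++) {
            pulled++; // count how many elements the chain actually requests
            yield i;
          }
        }

        const firstThreeEvens = source()
          .filter((x) => x % 2 === 0)
          .take(3)
          .toArray();

        console.log(firstThreeEvens); // [0, 2, 4]
        console.log(pulled); // 5 — only five values ever requested
        ```

        An eager array chain couldn’t do this at all here, since the source is unbounded.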

        But if you are filtering then mapping an array, an iterator can’t short-circuit. You need to visit every element of the source array (to filter) and then every element that survives (to map) anyway, so the main thing laziness buys you is skipping the intermediate filtered array.
        Edit: and I bet there is overhead in keeping iterators alive and context switching as each new value gets requested, instead of batch-processing an array and then moving on to the next batch operation.
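
        For what it’s worth, the callback counts come out the same either way; what differs is the order and the intermediate array. A small trace (iterator-helpers runtime assumed; the log array is mine):

        ```javascript
        // Log which callback fires when: lowercase = array chain, uppercase = iterator chain.
        const log = [];
        const src = [1, 2, 3];

        // Eager: all filter calls first, then all map calls on the intermediate array.
        src.filter((x) => (log.push(`f${x}`), x !== 2))
           .map((x) => (log.push(`m${x}`), x * 10));

        // Lazy: filter and map interleave as each element is pulled through the chain.
        src.values()
           .filter((x) => (log.push(`F${x}`), x !== 2))
           .map((x) => (log.push(`M${x}`), x * 10))
           .toArray();

        console.log(log.join(" ")); // f1 f2 f3 m1 m3 F1 M1 F2 F3 M3
        ```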