Is anybody aware of any self hosted alternatives to Parrot.ai or Otter.ai? I’ve tried these services and I’m finding them very useful, but the price tag is a little steep. It seems like something that the open source community could solve. Anybody know of any projects, either existing or upcoming? Thanks!

  • Cole@midwest.socialOP
    11 months ago

    Thank you! I’ll see if I can string together a few things to come up with my own homebrew version of these services. Honestly, for what they’re charging, I think I can justify a new dedicated GPU. I’ve got a few other Docker containers/services which could take advantage of it anyway, so maybe this is the excuse I’ve been needing to pull the trigger on that purchase.

    • chmclhpby@lemmy.world
      11 months ago

      LLaMA-2 was just released, and the fine-tunes people have made of it are topping the leaderboards right now in terms of performance for an open-source language model. As for inference, don’t forget to look into quantization so you can run larger models on limited VRAM. I’ve heard good things about vLLM and llama.cpp and its derivatives.
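      For a rough sense of what quantization buys you, here’s some back-of-envelope math (weights only; real usage adds KV-cache and activation overhead on top, so treat these as lower bounds):

      ```python
      # Approximate VRAM needed for just the model weights:
      # bytes ~= num_params * bits_per_weight / 8

      def weight_vram_gb(params_billion: float, bits_per_weight: float) -> float:
          """Rough GB of VRAM for the weights alone, ignoring overhead."""
          return params_billion * 1e9 * bits_per_weight / 8 / 1e9

      for bits in (16, 8, 4):
          print(f"13B model at {bits}-bit: ~{weight_vram_gb(13, bits):.1f} GB")
      # 16-bit: ~26.0 GB, 8-bit: ~13.0 GB, 4-bit: ~6.5 GB
      ```

      So a 4-bit 13B model fits comfortably in the 12 GB of a 3060, with room left for context, while the fp16 version wouldn’t come close.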

      If you’re looking for a GPU around $300, I’ve heard a used 3060 is better value than a 4060 right now on performance and memory throughput, but not power efficiency (if you want an easy time with ML, unfortunately the only option is Nvidia).

      Good luck! Would be nice to get an update if you find a good solution; it seems like others could share your use case.

      • Cole@midwest.socialOP
        11 months ago

        Thanks for the tip on the GPU! I live in an area where power is relatively cheap, so I’ll probably go for the 3060. I really wish some of these tools would work better with AMD, since their drivers seem to be more Linux-friendly these days.

        If I get something going, I’ll share for sure!