I wish our company forced AI on us. Our security is so tight, it's pretty much impossible to use any good LLMs.
It really doesn't take that beefy of a machine to run a good LLM locally instead of paying some SaaS company to do it for you.
I've got a refurb homelab server off PCSP with 512 GB of RAM for under $1k, and I run decently good models (deepseek-r1:70b, llama3.3:70b). Given your username, you might even try pitching a GPU server to them as dual-purpose: LLM + hashcat. :)
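For anyone wondering why 512 GB is way more than you strictly need: here's the rough back-of-envelope math for a 70B model. The 4-bit quantization and ~20% overhead figures are my own assumptions, not something from this thread:

```python
# Back-of-envelope RAM estimate for running a 70B-parameter model locally.
# Assumptions (mine): 4-bit quantized weights (~0.5 bytes/param) plus
# ~20% overhead for KV cache and runtime buffers.
params = 70e9
bytes_per_param = 0.5   # 4-bit quantization
overhead = 1.2          # KV cache, activations, runtime

ram_gb = params * bytes_per_param * overhead / 1e9
print(f"~{ram_gb:.0f} GB RAM for a 70B model at 4-bit")
# prints "~42 GB RAM for a 70B model at 4-bit"
```

So a 70B model at 4-bit fits comfortably in well under 10% of that box; the extra headroom is what lets you run bigger quants or multiple models at once.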
How would that help me? My work laptop doesn't have 512GB RAM, not even 10% of that.
Because when a company bans LLMs at work on security grounds, the concern is usually that an employee will paste confidential company data into the LLM, and with a SaaS LLM that means exposing the data to a third party.
But if your company buys a server and runs the model itself, the data never leaves the company network, so that security risk goes away.
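Concretely, "self-hosted" just means your tooling talks to an internal endpoint instead of a vendor's. A minimal sketch, assuming an Ollama-style `/api/generate` API on a made-up internal hostname (`llm.internal` is hypothetical):

```python
import json
import urllib.request

# Hypothetical internal endpoint -- the whole point is that the prompt
# (possibly containing confidential data) only ever travels to this host,
# never to an outside SaaS vendor.
INTERNAL_URL = "http://llm.internal:11434/api/generate"  # Ollama-style API

def build_request(prompt: str) -> urllib.request.Request:
    """Build (but don't send) a completion request to the self-hosted model."""
    body = json.dumps({"model": "llama3.3:70b", "prompt": prompt}).encode()
    return urllib.request.Request(
        INTERNAL_URL, data=body,
        headers={"Content-Type": "application/json"},
    )

req = build_request("Summarize this confidential doc...")
print(req.full_url)  # data goes to the internal host, not a SaaS provider
```

From the security team's point of view, that's the entire difference: same prompts, same workflow, but the destination is a box they control.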