AI services as additional Docker containers on a Start9 server

Hello,

I would like to set up my own AI server.
Since I still have 1 TB left on my Start9 server, I was considering installing these services (using a configuration ChatGPT proposed) on it.

ChatGPT answered my security concerns as follows:

“Separate AI services logically and network-wise from the node – e.g., using separate IPs, network namespaces, or VPN containers.”
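
To make that concrete, this is roughly the kind of setup I have in mind. It's only a sketch; the Ollama image, port 11434, and the network name are my own assumptions, not anything ChatGPT or Start9 specified:

```yaml
# Sketch: run the LLM backend on its own Docker network that none of the
# node's services join, and expose its API only on localhost.
services:
  ollama:
    image: ollama/ollama            # assumed LLM backend
    container_name: ollama
    networks:
      - ai-isolated                 # dedicated network, separate from the node
    ports:
      - "127.0.0.1:11434:11434"     # API reachable only from the host itself
    volumes:
      - ollama-data:/root/.ollama   # persist downloaded models
    restart: unless-stopped

networks:
  ai-isolated:
    driver: bridge

volumes:
  ollama-data:
```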

Any opinion / experience?

Thanks!

I’m not sure what your exact question is.

I run FreeGPT-2 on my StartOS server. I like that it runs locally and doesn't hand anything I type to big tech as training data.

It is not as fast as ChatGPT, but it works well for me!

I want to set up another LLM in a Docker container, not just FreeGPT.
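
Something along these lines is what I'm picturing. Ollama and the llama3 model are only examples I'm assuming, not a Start9 package:

```bash
# Start the backend with its API bound to localhost only
docker run -d --name ollama \
  -p 127.0.0.1:11434:11434 \
  -v ollama-data:/root/.ollama \
  ollama/ollama

# Pull a model and try it out (llama3 is just an example)
docker exec -it ollama ollama pull llama3
docker exec -it ollama ollama run llama3
```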