What IT infrastructure is required to deploy LLMs

rakhirhif8963
Posts: 549
Joined: Mon Dec 23, 2024 3:15 am


Post by rakhirhif8963 »

King-Smith notes, however, that private models need buy-in from all stakeholders in an organization, and he encourages IT leaders considering private LLMs to conduct a risk assessment before deploying them.

“When deploying them, companies should have clearly defined policies for their use,” adds King-Smith. “As with any other critical IT resource, access must be controlled for key personnel, especially if they are working with sensitive information.”
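As a rough sketch of what that kind of access control might look like in practice (the role names, the query_private_llm wrapper and the audit-log path below are illustrative assumptions, not any particular product's API):

[code]
# Hypothetical role-based gate in front of a private LLM endpoint.
# Role names, the send_request callable and the audit log path are
# illustrative assumptions, not part of any specific product.

ALLOWED_ROLES = {"legal", "finance", "engineering-leads"}

def query_private_llm(user, prompt, send_request):
    """Forward a prompt to the internal model only for authorised roles."""
    if not ALLOWED_ROLES.intersection(user.get("roles", [])):
        raise PermissionError(f"{user['name']} is not cleared for LLM access")
    # Record who asked what, so sensitive usage can be audited later.
    with open("/var/log/llm_audit.log", "a") as log:
        log.write(f"{user['name']}\t{prompt[:80]}\n")
    return send_request(prompt)
[/code]

A simple audit trail like this is one way to back up the clearly defined usage policies King-Smith describes.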

For example, companies that need to comply with regulations such as ITAR, GDPR and HIPAA should check that their LLMs are compliant. As an example of accidental misuse, King-Smith cites cases where lawyers have been caught preparing cases in ChatGPT, a clear breach of attorney-client privilege.

The main advantage of private LLMs over public ones, he says, is that they can look at internal information stored in email, internal documents, project management systems and other data sources within an organization. “That rich repository built into your private model enhances its ability to work within an enterprise,” King-Smith says.
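In practice, that kind of lookup over email, documents and project systems is usually built as retrieval: internal text is indexed ahead of time, the passages most relevant to a question are fetched, and only those passages are handed to the private model. A minimal sketch, assuming hypothetical embed() and generate() functions standing in for whatever embedding model and LLM the organization actually runs:

[code]
# Sketch of retrieval over internal documents feeding a private LLM.
# embed() and generate() are assumed placeholders for the embedding
# model and LLM the organization actually operates.
import numpy as np

def build_index(documents, embed):
    """Embed every internal document once and keep the vectors in memory."""
    return [(doc, embed(doc)) for doc in documents]

def answer(question, index, embed, generate, top_k=3):
    """Retrieve the most similar passages and hand only those to the model."""
    q_vec = embed(question)
    scored = sorted(
        index,
        key=lambda pair: float(np.dot(q_vec, pair[1])),
        reverse=True,
    )
    context = "\n\n".join(doc for doc, _ in scored[:top_k])
    prompt = f"Answer using only this internal context:\n{context}\n\nQuestion: {question}"
    return generate(prompt)
[/code]

Keeping the index and the model inside the organization's own environment is what lets this "rich repository" stay private.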

Working with private LLMs means that the organization’s internal IT department is responsible for maintaining the hardware and software. LLM training is performed using an array of graphics processing units (GPUs) installed on AI-optimized servers.
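To illustrate how such a GPU array is put to work, training or fine-tuning is typically spread across every available device with a framework such as PyTorch's DistributedDataParallel; the tiny model and random batches below are placeholders for a real LLM and corpus:

[code]
# Minimal sketch of multi-GPU training with PyTorch DistributedDataParallel.
# Launch with: torchrun --nproc_per_node=<num_gpus> train.py
# The toy model and random data are placeholders for a real LLM and dataset.
import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

def main():
    dist.init_process_group("nccl")               # one process per GPU
    rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(rank)

    model = torch.nn.Linear(1024, 1024).cuda(rank)  # stand-in for the LLM
    model = DDP(model, device_ids=[rank])
    optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

    for step in range(100):
        batch = torch.randn(8, 1024, device=f"cuda:{rank}")  # placeholder batch
        loss = model(batch).pow(2).mean()
        optimizer.zero_grad()
        loss.backward()                             # gradients sync across GPUs
        optimizer.step()

    dist.destroy_process_group()

if __name__ == "__main__":
    main()
[/code]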

Many organizations choose not to host these servers on-premises, but instead to use external hardware accessed through a public infrastructure-as-a-service (IaaS) provider. For example, retail giant Walmart uses public cloud providers alongside its own generative AI stack.
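On the IaaS route, that external hardware is typically rented with a single API call to the provider. A hedged example against AWS EC2 via boto3, where the AMI ID, instance type and key pair name are placeholder assumptions to replace with your own:

[code]
# Sketch of renting a GPU server from a public IaaS provider (AWS EC2 here).
# The AMI ID, instance type and key pair name are placeholder assumptions.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",   # a GPU-enabled deep learning AMI (placeholder)
    InstanceType="p4d.24xlarge",       # multi-GPU instance; pick what the workload needs
    KeyName="my-keypair",              # placeholder key pair
    MinCount=1,
    MaxCount=1,
)
print("Launched instance:", response["Instances"][0]["InstanceId"])
[/code]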

“It’s really important to us that our user, customer, and intellectual property data stays within our firewall and isn’t used to supplement other training data sets, so we spend a lot of time figuring out how to do that,” says Walmart senior vice president David Glick, who leads the retailer’s enterprise business services.

Using a public cloud to host and run private LLMs allows IT managers to avoid the risk of data breaches that can occur if proprietary data is loaded into a public LLM. A private LLM running on a cloud infrastructure allows organizations to take advantage of the scalability and elasticity of the public cloud while maintaining the security of their proprietary data.

The industry has recognized the demand for IT infrastructure and platforms optimized for AI, and major server vendors have adjusted their offerings to accommodate ML and LLM workloads.

“Companies see the potential of GenAI and want the results that come with it, but they are hesitant to move their data to the cloud for fear that their internal data will become part of the public domain,” says Paulo Perreira, vice president of systems engineering at Nutanix.