LLM Providers
Supported LLM Providers
The following configurations can be used when creating the LLMConfig CR. The LLMConfig is then referenced by name from the Remediator CR.
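As an illustration of that reference, a Remediator CR might point at the LLMConfig by name. The exact Remediator schema is not shown in this document, so the field names below (in particular llmConfigRef) are assumptions for illustration only:

```yaml
# Hypothetical sketch only: the Remediator CRD schema is not documented here,
# and the llmConfigRef field name is assumed, not confirmed.
apiVersion: serviceagents.nirmata.io/v1alpha1
kind: Remediator
metadata:
  name: remediator-agent
  namespace: nirmata
spec:
  llmConfigRef:
    name: remediator-agent-llm
```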
AWS Bedrock
Using the Pod Identity Agent (recommended when running in an EKS cluster):
Create an IAM role with a trust policy for the Pod Identity Agent.
aws iam create-role \
  --role-name remediator-agent-role \
  --assume-role-policy-document '{
    "Version": "2012-10-17",
    "Statement": [
      {
        "Effect": "Allow",
        "Principal": { "Service": "pods.eks.amazonaws.com" },
        "Action": [ "sts:AssumeRole", "sts:TagSession" ]
      }
    ]
  }'
Give the role permission to invoke Bedrock models.
aws iam put-role-policy \
  --role-name remediator-agent-role \
  --policy-name BedrockInvokePolicy \
  --policy-document '{
    "Version": "2012-10-17",
    "Statement": [
      {
        "Sid": "BedrockInvoke",
        "Effect": "Allow",
        "Action": [
          "bedrock:InvokeModel",
          "bedrock:InvokeModelWithResponseStream"
        ],
        "Resource": "arn:aws:bedrock:<AWS_REGION>:<AWS_ACCOUNT_ID>:application-inference-profile/<BEDROCK_INFERENCE_PROFILE>"
      }
    ]
  }'
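A malformed inline JSON document makes the aws CLI fail with a parse error, so it can help to validate the policy document locally before running the command. A minimal sketch (the policy content here is an abbreviated stand-in, not the full policy above):

```shell
# Validate policy JSON locally before passing it to the aws CLI.
# python3 -m json.tool exits non-zero on invalid JSON.
policy='{"Version": "2012-10-17", "Statement": []}'
if echo "$policy" | python3 -m json.tool > /dev/null; then
  echo "policy JSON is valid"
fi
```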
Bind the IAM role to your Kubernetes ServiceAccount using Pod Identity. Replace <CLUSTER_NAME> and <ACCOUNT_ID> with your actual cluster name and account ID.
aws eks create-pod-identity-association \
  --cluster-name <CLUSTER_NAME> \
  --namespace nirmata \
  --service-account remediator-agent \
  --role-arn arn:aws:iam::<ACCOUNT_ID>:role/remediator-agent-role
Verify the association.
aws eks list-pod-identity-associations \
  --cluster-name <CLUSTER_NAME>
Create the LLMConfig CR.
apiVersion: serviceagents.nirmata.io/v1alpha1
kind: LLMConfig
metadata:
  name: remediator-agent-llm
  namespace: nirmata
spec:
  type: bedrock
  bedrock:
    model: MODEL_ARN_OR_INFERENCE_ARN
    region: AWS_REGION
Using credentials:
Create a Kubernetes secret in the nirmata namespace with your AWS credentials.
kubectl create secret generic aws-bedrock-credentials \
  --from-literal=aws_access_key_id=AWS_ACCESS_KEY_ID \
  --from-literal=aws_secret_access_key=AWS_SECRET_ACCESS_KEY \
  --from-literal=aws_session_token=AWS_SESSION_TOKEN \
  -n nirmata
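Kubernetes stores each --from-literal value base64-encoded in the Secret's data field. A local sketch of that encoding, using a made-up placeholder value:

```shell
# Kubernetes base64-encodes Secret values; 'AKIAEXAMPLE' is a made-up
# placeholder, not a real access key ID.
printf 'AKIAEXAMPLE' | base64
# Decoding recovers the original value, as a consumer of
# 'kubectl get secret -o jsonpath' would do:
printf 'QUtJQUVYQU1QTEU=' | base64 --decode
```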
Create the LLMConfig CR.
apiVersion: serviceagents.nirmata.io/v1alpha1
kind: LLMConfig
metadata:
  name: remediator-agent-llm
  namespace: nirmata
spec:
  type: bedrock
  bedrock:
    model: MODEL_ARN_OR_INFERENCE_ARN
    region: AWS_REGION
    secretRef:
      name: aws-bedrock-credentials
      key: aws_access_key_id
Azure OpenAI
Create a Kubernetes secret in the nirmata namespace with your Azure credentials.
kubectl create secret generic azure-openai-credentials \
  --from-literal=api-key=AZURE_API_KEY \
  -n nirmata
Create the LLMConfig CR.
apiVersion: serviceagents.nirmata.io/v1alpha1
kind: LLMConfig
metadata:
  name: remediator-agent-llm
  namespace: nirmata
spec:
  type: azure-openai
  azureOpenAI:
    endpoint: https://YOUR_RESOURCE_NAME.openai.azure.com/
    deploymentName: DEPLOYMENT_NAME
    secretRef:
      name: azure-openai-credentials
      key: api-key
      namespace: nirmata