Scheduling Telemetry Upload
If you want to upload telemetry data on a schedule, use the following example to set up a Kubernetes CronJob that runs the telemetry command for you on a regular basis. The recommended upload frequency is every three months.
For more information about the gathered data, collection purposes, and data storage location, see Telemetry.
A bundle is tied to a specific cluster. If you want to set up a schedule for more than one cluster in your environment, set up a CronJob for each cluster individually.
This feature is currently not available in air-gapped environments.
Prerequisites
Your environment must have a valid license. See Add a DKP license for more information.
Your host machine (where the commands are executed) must have access to the Internet.
Ensure your firewall rules allow HTTPS calls and connections to the https://api.secure-upload.d2iq.cloud/secure-upload endpoint. A quick way to verify connectivity is shown after these prerequisites.
You must have access to the cluster’s kubeconfig file.
Your cluster nodes must have access to the Internet.
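To verify connectivity from the host machine, you can run a quick check against the upload endpoint. This is only a sketch; it assumes curl is available on the host and that any required proxy variables are already set in your shell.

```bash
# Quick reachability check for the telemetry upload endpoint (sketch, not part of DKP).
# Any HTTP status code in the response means the endpoint is reachable through your firewall;
# a timeout or connection error suggests HTTPS traffic to the endpoint is blocked.
curl -sS -o /dev/null -w "HTTP status: %{http_code}\n" \
  --connect-timeout 10 \
  https://api.secure-upload.d2iq.cloud/secure-upload
```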
Create and Enable the CronJob
The following example shows a Kubernetes CronJob that runs a command inside the cluster on an established schedule.
1. Create a file (for example, CronJob.yaml) with the schedule using the Cron syntax. In this example, the job is called telemetry and runs four times a year, at midnight on the first day of every third month (January, April, July, and October):

```yaml
apiVersion: batch/v1
kind: CronJob
metadata:
  name: telemetry
spec:
  schedule: "0 0 1 */3 *"
  jobTemplate:
    spec:
      template:
        spec:
          serviceAccountName: telemetry-collector
          restartPolicy: OnFailure
          containers:
            - name: telemetry
              image: alpine:3.6
              imagePullPolicy: IfNotPresent
              env:
                - name: DKP_VERSION
                  value: v2.7.0
              command:
                - /bin/sh
                - -c
                - |
                  apk update && apk add curl
                  curl -L https://downloads.d2iq.com/dkp/${DKP_VERSION}/dkp_${DKP_VERSION}_linux_amd64.tar.gz | tar -xz
                  ./dkp upload telemetry --bundle-profile diagnostics
---
apiVersion: v1
kind: ServiceAccount
metadata:
  name: telemetry-collector
---
apiVersion: rbac.authorization.k8s.io/v1
kind: ClusterRole
metadata:
  name: telemetry-collector
rules:
  - apiGroups: ["*"]
    resources: ["*"]
    verbs: ["get", "watch", "list"]
---
apiVersion: rbac.authorization.k8s.io/v1
kind: ClusterRoleBinding
metadata:
  name: telemetry-collector
  namespace: default
roleRef:
  apiGroup: rbac.authorization.k8s.io
  kind: ClusterRole
  name: telemetry-collector
subjects:
  - kind: ServiceAccount
    namespace: default
    name: telemetry-collector
```

Specify a different --bundle-profile flag if you want to limit the amount of shared data. See Sending Telemetry Data for Analysis for more information on the telemetry profile.
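Optionally, before applying the manifest, you can run a client-side dry run to catch YAML or schema mistakes early. This is a sketch; it assumes the manifest was saved as CronJob.yaml, the file name used in the next step.

```bash
# Optional: validate the manifest without creating anything in the cluster (sketch).
kubectl apply --dry-run=client -f CronJob.yaml --kubeconfig=<target_cluster_kubeconfig>
```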
2. Apply the created file to the cluster where you would like to automate the command. Replace the placeholder <target_cluster_kubeconfig> with the path to the target cluster’s kubeconfig file:

```bash
kubectl apply -f CronJob.yaml --kubeconfig=<target_cluster_kubeconfig>
```
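To confirm the schedule was registered, and to exercise the upload without waiting for the next scheduled run, you can inspect the CronJob and trigger a one-off Job from it. This is a sketch; it assumes the CronJob was created in the default namespace as in the example above, and the Job name telemetry-manual-run is arbitrary.

```bash
# Confirm the CronJob exists and shows the expected schedule.
kubectl get cronjob telemetry --kubeconfig=<target_cluster_kubeconfig>

# Optionally trigger a one-off run from the CronJob template (Job name is arbitrary).
kubectl create job telemetry-manual-run --from=cronjob/telemetry --kubeconfig=<target_cluster_kubeconfig>

# Follow the logs of the manually triggered run.
kubectl logs job/telemetry-manual-run -f --kubeconfig=<target_cluster_kubeconfig>
```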