Deploy Kafka Topic Message Produce

Application Scenario

Huawei Cloud Distributed Message Service (DMS) for Kafka is a highly available, reliable, and high-performance distributed message middleware service, widely used in big data, log collection, stream processing, and other scenarios. With the Kafka topic message production capability, you can send messages to a specified Kafka topic for reliable message transmission and processing. Using Terraform to automate this deployment ensures standardized, consistent message production configuration and improves operational efficiency. This best practice introduces how to use Terraform to automatically deploy Kafka topic message production, including creating the Kafka instance, the topic, and the message production resource.

This best practice involves the following main resources and data sources:

Data Sources

  • huaweicloud_availability_zones: queries the availability zones available in the current region

  • huaweicloud_dms_kafka_flavors: queries the Kafka instance flavors matching the given criteria

Resources

  • huaweicloud_vpc: the VPC hosting the Kafka instance

  • huaweicloud_vpc_subnet: the subnet within the VPC

  • huaweicloud_networking_secgroup: the security group bound to the instance

  • huaweicloud_dms_kafka_instance: the Kafka instance

  • huaweicloud_dms_kafka_topic: the Kafka topic

  • huaweicloud_dms_kafka_message_produce: the message production to the topic

Resource/Data Source Dependencies

data.huaweicloud_availability_zones
    └── data.huaweicloud_dms_kafka_flavors
            └── huaweicloud_dms_kafka_instance

huaweicloud_vpc
    └── huaweicloud_vpc_subnet
            └── huaweicloud_dms_kafka_instance

huaweicloud_networking_secgroup
    └── huaweicloud_dms_kafka_instance
            └── huaweicloud_dms_kafka_topic
                    └── huaweicloud_dms_kafka_message_produce

Operation Steps

1. Script Preparation

Prepare a TF file (e.g., main.tf) in the specified workspace for writing the current best practice script, ensuring that it (or another TF file in the same directory) contains the provider version declaration and the Huawei Cloud authentication information required for deploying resources. Refer to the "Preparation Before Deploying Huawei Cloud Resources" document for an introduction to this configuration.

2. Query Data Sources

Add the following script to the TF file (e.g., main.tf) to query availability zone and Kafka flavor information:
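The following is a sketch of these queries, consistent with the parameter description below; the variable names and defaults mirror that description, and the conditional `count` pattern (query only when no input is given) is a common convention in these practices, not a requirement.

```hcl
# Variables referenced by the data source queries
variable "availability_zones" {
  description = "The list of availability zones to which the Kafka instance belongs"
  type        = list(string)
  default     = []
}

variable "instance_flavor_id" {
  description = "The flavor ID of the Kafka instance (if empty, it is queried via the flavors data source)"
  type        = string
  default     = ""
}

variable "instance_flavor_type" {
  description = "The flavor type of the Kafka instance"
  type        = string
  default     = "cluster"
}

variable "instance_storage_spec_code" {
  description = "The storage specification code of the Kafka instance"
  type        = string
  default     = "dms.physical.storage.ultra.v2"
}

# Query all availability zones only when none are specified as input
data "huaweicloud_availability_zones" "test" {
  count = length(var.availability_zones) < 1 ? 1 : 0
}

# Query available Kafka flavors only when no flavor ID is specified as input
data "huaweicloud_dms_kafka_flavors" "test" {
  count = var.instance_flavor_id == "" ? 1 : 0

  type               = var.instance_flavor_type
  availability_zones = length(var.availability_zones) > 0 ? var.availability_zones : try(data.huaweicloud_availability_zones.test[0].names, [])
  storage_spec_code  = var.instance_storage_spec_code
}
```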

Parameter Description:

  • type: Flavor type, assigned by referencing input variable instance_flavor_type, default value is "cluster" (cluster mode)

  • availability_zones: Availability zone list, assigned by referencing input variables or availability zones data source

  • storage_spec_code: Storage specification code, assigned by referencing input variable instance_storage_spec_code, default value is "dms.physical.storage.ultra.v2"

3. Create Basic Network Resources

Add the following script to the TF file (e.g., main.tf) to create VPC, subnet and security group:
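A minimal sketch of the basic network resources; the variable names, the CIDR defaults, and the `cidrsubnet`/`cidrhost` derivation of the subnet range are illustrative assumptions.

```hcl
variable "vpc_name" {
  description = "The name of the VPC"
  type        = string
}

variable "vpc_cidr" {
  description = "The CIDR block of the VPC"
  type        = string
  default     = "192.168.0.0/16"
}

variable "subnet_name" {
  description = "The name of the subnet"
  type        = string
}

variable "security_group_name" {
  description = "The name of the security group"
  type        = string
}

# Create a VPC to host the Kafka instance
resource "huaweicloud_vpc" "test" {
  name = var.vpc_name
  cidr = var.vpc_cidr
}

# Create a subnet within the VPC, deriving its range from the VPC CIDR
resource "huaweicloud_vpc_subnet" "test" {
  vpc_id     = huaweicloud_vpc.test.id
  name       = var.subnet_name
  cidr       = cidrsubnet(huaweicloud_vpc.test.cidr, 8, 0)
  gateway_ip = cidrhost(cidrsubnet(huaweicloud_vpc.test.cidr, 8, 0), 1)
}

# Create a security group for the Kafka instance
resource "huaweicloud_networking_secgroup" "test" {
  name                 = var.security_group_name
  delete_default_rules = true
}
```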

4. Create Kafka Instance Resource

Add the following script to the TF file (e.g., main.tf) to instruct Terraform to create a Kafka instance resource:
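A sketch of the instance resource, following the parameter description below. The variable declarations (instance_name, instance_engine_version, and so on) follow the same pattern as in the previous steps and are omitted here; the attribute names inside the `port_protocol` block are illustrative assumptions, so consult the provider schema for the exact names.

```hcl
resource "huaweicloud_dms_kafka_instance" "test" {
  name               = var.instance_name
  availability_zones = length(var.availability_zones) > 0 ? var.availability_zones : try(data.huaweicloud_availability_zones.test[0].names, [])
  engine_version     = var.instance_engine_version
  flavor_id          = var.instance_flavor_id != "" ? var.instance_flavor_id : try(data.huaweicloud_dms_kafka_flavors.test[0].flavors[0].id, null)
  storage_spec_code  = var.instance_storage_spec_code
  storage_space      = var.instance_storage_space
  broker_num         = var.instance_broker_num
  vpc_id             = huaweicloud_vpc.test.id
  network_id         = huaweicloud_vpc_subnet.test.id
  security_group_id  = huaweicloud_networking_secgroup.test.id
  access_user        = var.instance_access_user_name
  password           = var.instance_access_user_password
  enabled_mechanisms = var.instance_enabled_mechanisms

  # Optional protocol configuration for private and public network access
  # (attribute names in this block are illustrative; check the provider documentation)
  dynamic "port_protocol" {
    for_each = var.instance_port_protocol != null ? [var.instance_port_protocol] : []

    content {
      private_sasl_ssl_enable = port_protocol.value.private_sasl_ssl_enable
      public_plain_enable     = port_protocol.value.public_plain_enable
    }
  }

  # Availability zones and flavor ID cannot be modified after creation
  lifecycle {
    ignore_changes = [
      availability_zones,
      flavor_id,
    ]
  }
}
```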

Parameter Description:

  • name: Kafka instance name, assigned by referencing input variable instance_name

  • availability_zones: Availability zone list, assigned by referencing input variables or availability zones data source

  • engine_version: Engine version, assigned by referencing input variable instance_engine_version, default value is "2.7"

  • flavor_id: Flavor ID, assigned by referencing input variables or Kafka flavors data source

  • storage_spec_code: Storage specification code, assigned by referencing input variable instance_storage_spec_code, default value is "dms.physical.storage.ultra.v2"

  • storage_space: Storage space, assigned by referencing input variable instance_storage_space, default value is 600 (GB)

  • broker_num: Number of brokers, assigned by referencing input variable instance_broker_num, default value is 3

  • vpc_id: VPC ID, assigned by referencing the VPC resource

  • network_id: Network subnet ID, assigned by referencing the subnet resource

  • security_group_id: Security group ID, assigned by referencing the security group resource

  • access_user: Access user name, assigned by referencing input variable instance_access_user_name, optional parameter

  • password: Access password, assigned by referencing input variable instance_access_user_password, optional parameter

  • enabled_mechanisms: Enabled authentication mechanisms, assigned by referencing input variable instance_enabled_mechanisms, default value is ["PLAIN"]

  • port_protocol: Port protocol configuration, configured through dynamic blocks, supports multiple protocols for private and public network access

5. Create Kafka Topic Resource

Add the following script to the TF file (e.g., main.tf) to create a Kafka topic:
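A sketch of the topic resource matching the parameter description below; the variable declarations are again omitted, and the `name`/`value` attribute pair inside the `configs` block is an assumption.

```hcl
resource "huaweicloud_dms_kafka_topic" "test" {
  instance_id      = huaweicloud_dms_kafka_instance.test.id
  name             = var.topic_name
  partitions       = var.topic_partitions
  replicas         = var.topic_replicas
  aging_time       = var.topic_aging_time
  sync_replication = var.topic_sync_replication
  sync_flushing    = var.topic_sync_flushing
  description      = var.topic_description

  # Optional topic-level configurations
  dynamic "configs" {
    for_each = var.topic_configs

    content {
      name  = configs.value.name
      value = configs.value.value
    }
  }
}
```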

Parameter Description:

  • instance_id: Kafka instance ID, assigned by referencing the Kafka instance resource

  • name: Topic name, assigned by referencing input variable topic_name

  • partitions: Number of partitions, assigned by referencing input variable topic_partitions, default value is 10

  • replicas: Number of replicas, assigned by referencing input variable topic_replicas, default value is 3

  • aging_time: Aging time, assigned by referencing input variable topic_aging_time, default value is 72 (hours)

  • sync_replication: Sync replication, assigned by referencing input variable topic_sync_replication, default value is false

  • sync_flushing: Sync flushing, assigned by referencing input variable topic_sync_flushing, default value is false

  • description: Topic description, assigned by referencing input variable topic_description, optional parameter

  • configs: Topic configurations, configured through dynamic blocks, optional parameter

6. Create Kafka Message Produce Resource

Add the following script to the TF file (e.g., main.tf) to create a Kafka message produce resource to send messages to the topic:
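A sketch of the message produce resource, following the parameter description below; the `name`/`value` attribute pair inside the `property_list` block is an assumption.

```hcl
resource "huaweicloud_dms_kafka_message_produce" "test" {
  instance_id = huaweicloud_dms_kafka_instance.test.id
  topic       = huaweicloud_dms_kafka_topic.test.name
  body        = var.message_body

  # Optional message properties such as KEY and PARTITION
  dynamic "property_list" {
    for_each = var.message_properties

    content {
      name  = property_list.value.name
      value = property_list.value.value
    }
  }
}
```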

Parameter Description:

  • instance_id: Kafka instance ID, assigned by referencing the Kafka instance resource

  • topic: Topic name, assigned by referencing the Kafka topic resource

  • body: Message body content, assigned by referencing input variable message_body

  • property_list: Message property list, configured through dynamic blocks, optional parameter, supports setting properties such as KEY and PARTITION

7. Preset Input Parameters Required for Resource Deployment (Optional)

In this practice, some resources use input variables to assign configuration content. These input parameters need to be manually entered during subsequent deployment. At the same time, Terraform provides a method to preset these configurations through tfvars files, which can avoid repeated input during each execution.

Create a terraform.tfvars file in the working directory with the following example content:
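An example sketch of the file; every value below is a placeholder to be replaced with your own configuration.

```hcl
vpc_name            = "tf_test_vpc"
subnet_name         = "tf_test_subnet"
security_group_name = "tf_test_security_group"
instance_name       = "tf_test_kafka_instance"

# Required only when authentication is enabled on the instance
instance_access_user_name     = "tf_test_user"
instance_access_user_password = "YourPassword@123"

topic_name   = "tf_test_topic"
message_body = "Hello, Kafka!"

# Optional message properties
message_properties = [
  {
    name  = "KEY"
    value = "tf_test_key"
  },
  {
    name  = "PARTITION"
    value = "1"
  }
]
```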

Usage:

  1. Save the above content as a terraform.tfvars file in the working directory (Terraform automatically loads a file with this exact name when executing commands; a file with any other name must end in .auto.tfvars, such as variables.auto.tfvars, to be loaded automatically)

  2. Modify parameter values according to actual needs, especially:

    • message_body needs to be set to the message content to be sent

    • message_properties can set message properties, such as KEY (message key), PARTITION (partition number), etc.

    • If the Kafka instance enables authentication, you need to configure instance_access_user_name and instance_access_user_password

    • instance_access_user_password needs to be set to a password that meets password complexity requirements

  3. When executing terraform plan or terraform apply, Terraform will automatically read the variable values in this file

In addition to using the terraform.tfvars file, you can also set variable values in the following ways:

  1. Command line parameters: terraform apply -var="message_body=Hello World" -var="topic_name=my_topic"

  2. Environment variables: export TF_VAR_message_body="Hello World" and export TF_VAR_topic_name="my_topic"

  3. Custom named variable file: terraform apply -var-file="custom.tfvars"

Note: If the same variable is set in multiple ways, Terraform resolves it with the following precedence: command line parameters > variable files > environment variables > default values. Because the password is sensitive information, it is recommended to set it through environment variables or an encrypted variable file. In addition, the Kafka instance must already exist and be in normal status, and the topic must already exist, before messages can be successfully produced.

8. Initialize and Apply Terraform Configuration

After completing the above script configuration, execute the following steps to create Kafka topic message production:

  1. Run terraform init to initialize the environment

  2. Run terraform plan to view the resource creation plan

  3. After confirming that the resource plan is correct, run terraform apply to start creating Kafka instances, topics, and message production

  4. Run terraform show to view the details of the created message production

Note: After the message production resource is created, messages will be immediately sent to the specified Kafka topic. If message properties (such as PARTITION) are set, messages will be sent to the specified partition. The instance's availability zones and flavor ID cannot be modified after creation, so they need to be configured correctly during creation. Through lifecycle.ignore_changes, Terraform can be prevented from modifying these immutable parameters in subsequent updates.
