Publishing Audit Events
Minimum Required Role: Navigator Administrator (also provided by Full Administrator)
Audit events can be published to a Kafka topic or to the system log (syslog). After configuring Cloudera Navigator to send audit events, failures to send events are logged to the Audit Server log.
Publishing Audit Events to Kafka
Navigator can stream audit events to a Kafka topic so the events can be used by another application or archived outside of the Navigator Audit Server database. To configure Navigator to stream audits to a Kafka topic:
- Create the topic in Kafka.
If the Kafka instance is configured with Topic Auto Creation enabled, you can skip this step. The topic will be created when the first event is sent to Kafka.
For steps for creating a Kafka topic, see the Kafka kafka-topics command in Using Apache Kafka Command-line Tools.
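As a sketch, topic creation might look like the following. The topic name NavigatorAuditEvents, the broker address, and the partition and replication counts are placeholder values to adapt to your cluster; older Kafka releases use --zookeeper with a ZooKeeper quorum instead of --bootstrap-server.

```shell
# Create a topic for Navigator audit events (hypothetical name and sizing;
# adjust --bootstrap-server, --partitions, and --replication-factor for your cluster).
kafka-topics --create \
  --bootstrap-server kafka-dedicated-1.example.com:9092 \
  --topic NavigatorAuditEvents \
  --partitions 3 \
  --replication-factor 3
```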
- Log in to the Cloudera Manager Admin Console using a Navigator Administrator or Full Administrator account.
- Select .
- Click the Configuration tab.
- If Kafka is managed by the same Cloudera Manager instance as Navigator:
- In the Search field, type "kafka" to display the configurable settings.
- For the Kafka Service property, select the Kafka service to which Cloudera Navigator will publish audit events.
- For the Kafka Topic property, enter the name of the topic to which Cloudera Navigator will publish the audit events.
- If Kafka is not managed by the same Cloudera Manager instance as Navigator, add properties to accommodate logging for Kafka:
- In the Search field, type "logging" to display the configurable settings.
- For the Navigator Audit Server Logging Advanced Configuration Snippet (Safety Valve) property, insert the following logging properties, replacing the brokerList and topic values with the values for your system:

log4j.logger.kafkaAuditStream=TRACE,KAFKA
log4j.appender.KAFKA=org.apache.kafka.log4jappender.KafkaLog4jAppender
log4j.additivity.kafkaAuditStream=false
log4j.appender.KAFKA.layout=org.apache.log4j.PatternLayout
log4j.appender.KAFKA.layout.ConversionPattern=%m%n
log4j.appender.KAFKA.syncSend=false
log4j.appender.KAFKA.brokerList=<comma-separated list of brokers as host:port>
log4j.appender.KAFKA.topic=<topic>
If Kafka is managed using Cloudera Manager, you can find the broker host names in the Instances tab for the Kafka service. List the host and port numbers in a comma-separated list without spaces. For example:
kafka-dedicated-1.example.com:9092,kafka-dedicated-2.example.com:9092,kafka-dedicated-3.example.com:9092
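As an illustrative check (not part of Navigator itself), a quick shell test can confirm that a broker list follows the host:port,host:port pattern with no whitespace:

```shell
# Hypothetical sanity check for the brokerList format: host:port entries
# separated by commas, with no spaces anywhere in the list.
BROKERS="kafka-dedicated-1.example.com:9092,kafka-dedicated-2.example.com:9092,kafka-dedicated-3.example.com:9092"
if printf '%s' "$BROKERS" | grep -Eq '^[A-Za-z0-9.-]+:[0-9]+(,[A-Za-z0-9.-]+:[0-9]+)*$'; then
  echo "brokerList format looks valid"
else
  echo "brokerList format is invalid"
fi
```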
- If Kafka is not managed by the same Cloudera Manager instance as Navigator and Kerberos is enabled in the Kafka cluster, additional configuration steps are required. These steps are described in the Kafka client security documentation, Configuring Apache Kafka Security.
- Include the following additional properties in the Navigator Audit Server Logging Advanced Configuration Snippet (Safety Valve), replacing the truststore location and truststore password with the values from your system:

log4j.appender.KAFKA.SaslKerberosServiceName=kafka
log4j.appender.KAFKA.clientJaasConfPath=navigator.jaas.conf
log4j.appender.KAFKA.SecurityProtocol=SASL_SSL
log4j.appender.KAFKA.SslTruststoreLocation=</path/to/truststore>
log4j.appender.KAFKA.SslTruststorePassword=<password>
- For the Navigator Kafka client, create a jaas.conf file that references the keytabs.
To generate keytabs, see Step 6: Get or Create a Kerberos Principal for Each User Account.
Replace the keyTab and principal values with the values from your system:

KafkaClient {
  com.sun.security.auth.module.Krb5LoginModule required
  useKeyTab=true
  keyTab="/etc/security/keytabs/mykafkaclient.keytab"
  principal="mykafkaclient/clients.hostname.com@EXAMPLE.COM";
};
- Set the location of the jaas.conf in the Navigator host environment.
export KAFKA_OPTS="-Djava.security.auth.login.config=/home/user/jaas.conf"
- Add a description of the changes and click Save Changes.
- Restart the role.
- You can validate that events are published to the topic using the Kafka kafka-console-consumer command as described in Using Apache Kafka Command-line Tools.
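As a sketch, assuming the topic is named NavigatorAuditEvents, the consumer invocation might look like the following; older Kafka releases use --zookeeper with a ZooKeeper quorum instead of --bootstrap-server:

```shell
# Read audit events from the topic (hypothetical topic name; adjust the
# broker address for your cluster). Press Ctrl-C to stop consuming.
kafka-console-consumer \
  --bootstrap-server kafka-dedicated-1.example.com:9092 \
  --topic NavigatorAuditEvents \
  --from-beginning
```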
Publishing Audit Events to Syslog
The Audit Server logs all audit records into a Log4j logger called auditStream. The log messages are logged at the TRACE level, with the attributes of the audit records. By default, the auditStream logger is inactive because the logger level is set to FATAL. It is also connected to a NullAppender, and does not forward to other appenders (additivity set to false).
To record the audit stream, configure the auditStream logger with the desired appender. For example, the standard SyslogAppender allows you to send the audit records to a remote syslog.
For example, the following rsyslog configuration accepts UDP syslog messages on port 514 and writes everything logged to the local2 facility to a file:

$ModLoad imudp
$UDPServerRun 514
# Accept everything (even DEBUG messages)
local2.* /my/audit/trail.log
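To spot-check the rsyslog side independently of Navigator, the standard logger utility can write a test message to the local2 facility; the file path below assumes the rsyslog configuration shown above:

```shell
# Send a test message to the local2 facility; with the rsyslog rules above,
# it should appear in /my/audit/trail.log.
logger -p local2.info "navigator audit syslog test"
```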
- Log in to the Cloudera Manager Admin Console using a Navigator Administrator or Full Administrator account.
- Select .
- Click the Configuration tab.
- Locate the Navigator Audit Server Logging Advanced Configuration Snippet property by typing its name in the Search box.
- Depending on the format type, enter:

log4j.appender.SYSLOG = org.apache.log4j.net.SyslogAppender
log4j.appender.SYSLOG.SyslogHost = hostname[:port]
log4j.appender.SYSLOG.Facility = local2
log4j.appender.SYSLOG.FacilityPrinting = true

To configure the specific stream type, enter:

Format: JSON
log4j.logger.auditStream = TRACE,SYSLOG
log4j.additivity.auditStream = false

Format: RSA EnVision
log4j.logger.auditStreamEnVision = TRACE,SYSLOG
log4j.additivity.auditStreamEnVision = false
- Click Save Changes.
- Restart the role.