Why is so much data being written into DBQL?

 

On a busy system, thousands of queries are submitted per minute, and logging every one of them in full detail generates an enormous amount of data.

To avoid the performance impact this logging has on the database, we need to be proactive and selective about the data we want to log.

That way we save space, system resources, and performance.

Note: DBQL logging can be tuned, but it should never be switched off altogether. It is a pillar for admins as well, and a lifesaver during disaster analysis 😀

But if space and performance are a concern, the tips below can help:

1. Modify the DBQL rules.

Example:

 
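Before changing anything, it is worth inspecting the rules currently in force. A minimal sketch, assuming access to the DBC views (and, for the second statement, a Teradata release that supports `SHOW QUERY LOGGING`):

```sql
-- List the DBQL rules currently in force.
SELECT * FROM DBC.DBQLRulesV;

-- On newer releases, the same information can also be shown with:
SHOW QUERY LOGGING ON ALL;
```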

Suggested modifications:

Detail logging of non-tactical work
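As a sketch, detail logging with SQL text and object data could look like this. The account name `'$M_BATCH'` is a placeholder; substitute the accounts that actually run your non-tactical (DSS/batch) work:

```sql
-- One full detail row per query, plus SQL text and referenced objects,
-- for the accounts running non-tactical work.
BEGIN QUERY LOGGING WITH SQL, OBJECTS
  LIMIT SQLTEXT=10000
  ON ALL ACCOUNT = ('$M_BATCH');
```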

 

Summary threshold logging for tactical work by account name
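A sketch of threshold logging for a tactical account; `'$H_TACT'` is a hypothetical account name, and with `CPUTIME` the threshold is expressed in hundredths of a CPU second:

```sql
-- Queries finishing under ~1 CPU second are only counted in
-- DBQLSummaryTbl; anything slower gets a full DBQLogTbl row,
-- which is exactly what you want to examine for tactical work.
BEGIN QUERY LOGGING LIMIT THRESHOLD = 100 CPUTIME
  ON ALL ACCOUNT = ('$H_TACT');
```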

 

Summary threshold logging for the Viewpoint user
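For the Viewpoint data-collection user, summary logging counts its frequent, short monitoring queries into buckets instead of writing a row per query. A sketch, assuming the user is named `viewpoint` on your system (with no unit keyword, the bucket boundaries are elapsed seconds):

```sql
-- Count Viewpoint's monitoring queries into elapsed-time buckets
-- (0-1s, 1-5s, 5-10s, >10s) in DBQLSummaryTbl only.
BEGIN QUERY LOGGING LIMIT SUMMARY = 1, 5, 10
  ON viewpoint;
```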

 

2. Change the TASM logging interval from 600 to 60 seconds (Workload Designer > General > Other > Intervals) so that logging is better suited to tactical work.

Until next time 🙂

Thanks & Regards,

Pankaj Chahar

Pankajchahar052@gmail.com

+91-8802350184

