Table with 300 million records

Database Tuning, SQL Tuning, Tuning Tools
  

Posted on Wed, 27 Apr 2016 5:41 pm

Good Afternoon Folks.

I have a table with more than 300 million records, and there is a query that uses this table to generate a report. The problem is that this query is taking too long and consuming a lot of database resources. I've already tried some alternative tuning on this query, but I couldn't improve it. I'm out of options; do you have any tips? Would partitioning this table be a solution?

Thank you.
dejambo

Posted on Fri, 06 May 2016 10:22 am

One solution when the table is this large is to partition it. If there is a date field, for example, you can partition by year, as in the sketch below.
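
A minimal sketch of range partitioning by year (the table and column names here are hypothetical, just to illustrate):

CREATE TABLE report_data (
  record_id   NUMBER,
  record_date DATE,
  amount      NUMBER
)
PARTITION BY RANGE (record_date) (
  PARTITION p2014 VALUES LESS THAN (TO_DATE('01/01/2015', 'DD/MM/YYYY')),
  PARTITION p2015 VALUES LESS THAN (TO_DATE('01/01/2016', 'DD/MM/YYYY')),
  PARTITION p2016 VALUES LESS THAN (TO_DATE('01/01/2017', 'DD/MM/YYYY')),
  PARTITION pmax  VALUES LESS THAN (MAXVALUE)
);

-- A report query that filters on record_date will then scan only
-- the matching partition (partition pruning), not all 300 million rows.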
nor.dbajunior
Location: Cuiabá

Posted on Tue, 10 May 2016 1:13 pm

Exactly.
Another way is to make efficient use of NULLs.
Remember that in Oracle, a row whose indexed columns are all NULL is not stored in the B-tree index.

An example of what I'm talking about: let's say you want to retrieve all the rows of a table that have not yet been processed (a table with 2 billion rows, where the rows you want to recover number only 200 or 300).
In this case, you create a STATUS field, set it to NULL once the record has been processed, and set it to 'P' for the records you still want to process. Then, simply create an index on this field.
That is, since the NULLs never enter the index, this query will be super fast, because the index will only hold those 200 or so rows. Exactly what you need: there is no need to scan an index of 2 billion rows, nor to do a FULL SCAN.
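
A minimal sketch of that pattern (the table and column names below are hypothetical, just to illustrate):

CREATE TABLE orders_queue (
  order_id NUMBER PRIMARY KEY,
  payload  VARCHAR2(4000),
  status   CHAR(1)   -- 'P' = pending, NULL = already processed
);

-- Rows where status IS NULL never enter this index,
-- so it stays tiny: only the pending rows are indexed.
CREATE INDEX orders_queue_status_ix ON orders_queue (status);

-- Fast: answered from the tiny index, no full scan needed.
SELECT order_id
  FROM orders_queue
 WHERE status = 'P';

-- Marking a row as processed removes its entry from the index.
UPDATE orders_queue
   SET status = NULL
 WHERE order_id = :id;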
dr_gori
Location: Seattle, WA, USA

Thomas F. G
https://www.patreon.com/glufke


