Help - Query Optimization

Database Tuning, SQL Tuning, Tuning Tools
marcus.asse
Rank: Estagiário Pleno
Posts: 4
Joined: Thu, 28 Feb 2013 10:49 am

Good afternoon.
I would like some help, if possible, with tuning a simple query that has no filters.

The query is this:


select * from sms_safx42_spo;
The problem with this query is that it returns around 10 million records, and I am writing its result to a file via UTL_FILE.

The execution plan follows:


SELECT STATEMENT REMOTE, GOAL = ALL_ROWS	Id=0	Operation=SELECT STATEMENT	Cost=8105	CPU cost=3727389226	Bytes=1682697087 
TABLE ACCESS FULL	Id=1	Operation=TABLE ACCESS	Cost=8105	CPU cost=3727389226	Bytes=1682697087 
Is there any way to improve it?

Note: this table is accessed via a db_link.
Michel Pessoa
Rank: Estagiário Pleno
Posts: 4
Joined: Mon, 18 Nov 2013 2:24 pm

Good afternoon,

My tip is to use a hint. The hint I will suggest splits the processing across more than one processing core. In the example below I am using 2 cores.


select /*+ parallel(2) */ * from sms_safx42_spo;
Please give us your feedback.
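A note on that hint (a sketch on my part, not something from the post above): the degree can also be tied to the table alias, and since the plan shows SELECT STATEMENT REMOTE, the whole statement runs on the remote site, so the hint has to be honored there.

```sql
-- Sketch: object-level form of the hint; "s" is just an illustrative alias.
select /*+ parallel(s, 2) */ *
from   sms_safx42_spo s;
```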
DanielNN
Moderador
Posts: 641
Joined: Mon, 03 Sep 2007 3:26 pm
Location: Fortaleza - CE
It is possible that the biggest bottleneck is in writing to the file, and not in the query itself.

att,

Daniel N.N.
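On the file-writing side, one knob worth checking is the max_linesize argument of UTL_FILE.FOPEN; the directory object DIR_EXPORT and file name below are hypothetical:

```sql
declare
  vFile utl_file.file_type;
begin
  -- The default max line size is 1024 bytes; 32767 is the maximum allowed.
  vFile := utl_file.fopen('DIR_EXPORT', 'export.txt', 'w', 32767);
  utl_file.put_line(vFile, 'one line of output');
  utl_file.fclose(vFile);
end;
/
```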
Trevisolli
Moderador
Posts: 2016
Joined: Wed, 12 Jan 2005 3:25 pm
Location: Araraquara - SP
Hey brother, all good?

I do not know if you have solved your problem yet (if so, please post the solution here for us); still, in a db_link case it is worth taking a look at the options below.

One tip would be to generate this txt on the remote side, running the script directly on the server that owns the "_SPO" table, and then transfer the txt. Would that work?

If you cannot, could you bring these records over, X rows at a time, into a local table and resolve the export of the file from that table?

Regards,

Trevisolli
OCA Oracle PL/SQL Developer Certified Associate
OCP Oracle Forms Developer Certified Professional
Araraquara-SP
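The local-table idea could be sketched like this (the db_link name remote_db is an assumption; the thread does not give it):

```sql
-- Create an empty local staging table with the same structure,
-- then load it with a direct-path insert over the db_link.
create table sms_safx42_spo_local as
  select * from sms_safx42_spo@remote_db where 1 = 0;

insert /*+ append */ into sms_safx42_spo_local
  select * from sms_safx42_spo@remote_db;
commit;
```

The file export then reads only the local table, keeping the db_link traffic to a single bulk transfer.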
souldeath
Rank: Programador Sênior
Posts: 48
Joined: Wed, 25 Aug 2010 9:39 am
Location: Limeira

Perhaps this problem has already been solved, but I will leave my contribution. As Trevisolli commented in the post above about bringing the records over X rows at a time, I thought of using BULK COLLECT.
This way you can return multiple records in a single fetch. There is one caveat, however: this feature uses more memory, so it is important to choose the batch size X carefully.

Here is a sample of the code I wrote:


 
PROCEDURE TESTE AS
    --
    CURSOR curOrders is
      select order_id,
             order_date,
             order_mode,
             customer_id,
             order_status,
             order_total,
             sales_rep_id,
             promotion_id,
             warehouse_id,
             delivery_type,
             cost_of_delivery,
             wait_till_all_available,
             delivery_address_id,
             customer_class,
             card_id,
             invoice_address_id
      from   orders;
    --
    -- Declaring the collection against the cursor keeps it in sync with the select list.
    TYPE tblOrders is table of curOrders%ROWTYPE INDEX BY PLS_INTEGER;
    clOrders       tblOrders;
    --
    -- Rows fetched per round trip: larger values mean fewer fetches but more memory.
    vLimit PLS_INTEGER := 4000;
    --
  BEGIN
    --
    open curOrders;
      --
      loop
        --
        fetch curOrders
        bulk  collect into clOrders
        limit vLimit;
        --
        -- Process the batch here (for example, write it to the file).
        dbms_output.put_line(to_char(clOrders.count));
        --
        -- A fetch returning fewer than vLimit rows is the last one.
        exit when clOrders.count < vLimit;
        --
      end loop;
      --
    close curOrders;
    --
  END TESTE;
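Putting this together with the original question, each fetched batch could be written to the file inside the loop. A minimal sketch, assuming a directory object DIR_EXPORT and exporting a single column (both hypothetical):

```sql
declare
  cursor curRows is
    select order_id from orders;  -- illustrative source query
  type tblRows is table of curRows%ROWTYPE index by pls_integer;
  clRows tblRows;
  vFile  utl_file.file_type;
  vLimit pls_integer := 4000;
begin
  vFile := utl_file.fopen('DIR_EXPORT', 'export.txt', 'w', 32767);
  open curRows;
  loop
    fetch curRows bulk collect into clRows limit vLimit;
    -- Write the whole batch before fetching the next one.
    for i in 1 .. clRows.count loop
      utl_file.put_line(vFile, clRows(i).order_id);
    end loop;
    exit when clRows.count < vLimit;
  end loop;
  close curRows;
  utl_file.fclose(vFile);
end;
/
```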