Extract data from Kayako - auto refresh SOLVED


Posted on Wed, 29 Jan 2020 2:18 pm

A few days ago I had to collect support statistics (tickets) from the Kayako tool.
Since the tool lives in the cloud, there is no way to connect to its database and extract what I need.
There is also no API for it.

The best we have is KQL (a sort of Kayako SQL), but it's very limited.

I was able to solve my problem as follows:
1. I created a REPORT inside Kayako with the columns I needed to export daily.
Something like:
Code: Select all
SELECT 'Users.User Organization',
'Users.Fullname',
'Tickets.Ticket Mask ID',
'Tickets.Status',
'Tickets.Priority',
'Tickets.Subject',
'Tickets.Ticket ID',
'Tickets.Department',
'Tickets.Last Activity',
'Ticket Posts.Ticket Post ID',
'Ticket Posts.Creation Date',
'Ticket Posts.Response Time',
'Ticket Posts.Email'
FROM 'Tickets','Ticket Posts', 'Users'
WHERE 'Tickets.Last Activity' = Last7Days()

In other words, it grabs everything that changed in the last 7 days (just to be safe, lol).

2. I created a shell script on Linux that logs in with my Kayako user and password and requests the report.
The script takes the login and password as parameters 1 and 2.
The report is Kayako report 144, exported as CSV.
Code: Select all
#!/bin/bash
LOGIN_URL="https://aaaaaaa.kayako.com/staff/Core/Default/Login"
REPORT_URL="https://aaaaaaa.kayako.com/staff/Reports/Report/Export/144/CSV"
COOKIE="/tmp/cookie_kayako.txt"
OUTFILE="/tmp/kayako.csv"
LOGINUSER="${1}"
LOGINPASSWORD="${2}"

echo "**************************************************************************"
echo "******************* Login with parameters 1 and 2 and save the session cookie"
echo "**************************************************************************"
# POST the credentials; -c writes the session cookie to a file
curl --data "username=${LOGINUSER}&password=${LOGINPASSWORD}" -c "$COOKIE" "$LOGIN_URL"
echo

echo "**************************************************************************"
echo "******************* Request the report (reusing the cookie)"
echo "**************************************************************************"
# POST with an empty body; -b sends the saved cookie; output goes to the CSV file
curl --data "" -b "$COOKIE" "$REPORT_URL" > "$OUTFILE"
echo
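
To refresh the data automatically, the script can be scheduled with cron. A minimal sketch, assuming the script was saved as /home/oracle/get_kayako.sh (the path, schedule, and credentials below are placeholders, not from the original setup):
Code: Select all
# Hypothetical crontab entry: export the report every day at 06:00
0 6 * * * /home/oracle/get_kayako.sh 'mylogin' 'mypassword' >> /tmp/kayako_export.log 2>&1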


3. The CSV file ends up on the Oracle server (if the script runs on another machine, copy it over; see the sketch below).
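A minimal copy step, assuming the script runs somewhere else (the hostname is a placeholder):
Code: Select all
# Hypothetical: push the export to the database server's /tmp
scp /tmp/kayako.csv oracle@my-oracle-server:/tmp/kayako.csv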
4. I created an EXTERNAL TABLE to read the exported CSV data.
Code: Select all
create table xxtg.xxtg_ext_kayako_csv (
  user_organization varchar2(50)
, full_name         varchar2(100)
, ticket_mask_id    varchar2(50)
, status            varchar2(50)
, priority          varchar2(255)
, subject           varchar2(255)
, ticket_id         number
, department        varchar2(50)
, last_activity     varchar2(50)
, ticket_post_id    number
, creation_date     varchar2(50)
, response_time     varchar2(50)
, email             varchar2(255)
)
organization external
( type oracle_loader
  default directory tmp
  access parameters ( records delimited by newline
                      fields terminated by ','
                      optionally enclosed by '"' and '"'
                      ltrim )
  location ('kayako.csv')
)
reject limit 1000;
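
The DEFAULT DIRECTORY tmp above must be an Oracle directory object pointing at the folder where the script drops the CSV. A minimal sketch, assuming /tmp and that the xxtg user needs the grants (adjust to your environment):
Code: Select all
-- Run as a privileged user; "tmp" must match the DEFAULT DIRECTORY above
create or replace directory tmp as '/tmp';
grant read, write on directory tmp to xxtg;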


5. I MERGE that data into my actual table (a sketch follows).
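
A minimal sketch of such a MERGE, assuming a target table xxtg.xxtg_kayako_tickets keyed on ticket_post_id (both names are placeholders, not from the original setup):
Code: Select all
-- Sketch: target table name and key column are assumptions
merge into xxtg.xxtg_kayako_tickets t
using xxtg.xxtg_ext_kayako_csv s
   on (t.ticket_post_id = s.ticket_post_id)
when matched then update
  set t.status        = s.status
    , t.priority      = s.priority
    , t.last_activity = s.last_activity
when not matched then insert
     ( user_organization, full_name, ticket_mask_id, status, priority
     , subject, ticket_id, department, last_activity, ticket_post_id
     , creation_date, response_time, email )
     values
     ( s.user_organization, s.full_name, s.ticket_mask_id, s.status, s.priority
     , s.subject, s.ticket_id, s.department, s.last_activity, s.ticket_post_id
     , s.creation_date, s.response_time, s.email );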

Done! That way we have up-to-date Kayako data automatically.
dr_gori
Location: Portland, OR USA

Thomas F. G

Have you answered someone's question today?
http://glufke.net/oracle/search.php?search_id=unanswered

